The scenario
A clinical event occurs involving a patient whose notes were generated by an ambient scribe.
The clinician's MDO asks how the AI was used, whether the output was reviewed, what system was in use, and whether the clinic had a protocol for AI use.
Some evidence may exist, but it is not yet documented in a clear, reviewable form.
Board-level oversight: what the board needs to evidence and own, and what a board AI review should cover, is set out on our Board AI Review page.
What ELSA AI does
- Produces a Board Findings Report written in plain English for senior readers, not technical specialists
- Delivers a one-page RAG Exposure Map the board can review, discuss and minute as part of its AI risk oversight
- Produces an AI Tool Inventory, AI Risk Register and Approved / Conditional / Prohibited Use Matrix through the Launchpad, structured for formal board adoption
- Provides periodic board-facing AI risk updates through the AI Exposure Sentinel™ retainer for clinics that want ongoing oversight support
ELSA AI does not provide legal advice, CQC certification, ICO approval, insurer coverage advice or clinical safety case sign-off. Board accountability and final approval of the AI governance position remain with the clinic's directors, partners and responsible officers.
MDO Query
Can ELSA AI tell us whether our MDO will support a claim involving AI?
No. ELSA AI does not provide MDO indemnity advice and does not determine whether a specific AI use will affect indemnity support.
What we do is help clinics evidence how AI use is approved, supervised, documented and controlled, covering human review, staff guidance, patient data exposure, incident reporting and disclosure-readiness materials.
Final MDO conversations and indemnity decisions remain with the clinician, clinic and MDO.
Advisory governance support only. Not legal advice, MDO indemnity advice, insurer coverage advice, CQC certification, ICO approval or clinical safety case sign-off.