The scenario
The board wants to establish or review its AI governance position.
Staff are using AI tools the board cannot name with confidence. There is no AI tool register, no board-approved AI usage policy and no formal AI risk register entry.
The board cannot yet answer a straightforward question from an insurer, commissioner, data protection officer (DPO), medical defence organisation (MDO) or regulator about which AI tools are in use and how they are controlled.
What a board AI review should cover
- Clarity on which AI tools are in use and who approved them
- Alignment of AI use with organisational strategy and clinical governance frameworks
- Whether Data Protection Impact Assessment (DPIA), vendor, clinical safety and DTAC-style evidence has been assessed where applicable
- Named accountability for AI governance and clinical workflow oversight
- AI risk captured on the organisational risk register with identified mitigations
- Evidence that AI-related risks and incidents are monitored and reported at governance level
- Plans for monitoring safety, benefit, adoption and unintended consequences
- Patient transparency and communication arrangements
- Definition of where AI supports decisions, documentation or workflow, and how patient questions, objections or concerns are handled
What ELSA AI does
- Produces a Board Findings Report written in plain English for senior readers, not technical specialists
- Delivers a one-page RAG Exposure Map the board can review, discuss and minute as part of its AI risk oversight
- Produces an AI Tool Inventory, AI Risk Register and Approved / Conditional / Prohibited Use Matrix through the Launchpad, structured for formal board adoption
- Provides periodic board-facing AI risk updates through the AI Exposure Sentinel™ retainer for clinics that want ongoing oversight support
ELSA AI does not provide legal advice, CQC certification, ICO approval, insurer coverage advice or clinical safety case sign-off. Board accountability and final approval of the AI governance position remain with the clinic’s directors, partners and responsible officers.
Board AI Review
Is AI governance just an IT or DPO issue?
No. AI governance involves IT and the DPO, but it is also a board, clinical governance, patient safety, professional accountability and operational risk issue.
Boards and partners need visibility of what AI is in use, where patient data may be involved, what evidence gaps exist, who owns the risks and what actions are required.
ELSA AI consolidates scattered AI use into a board-readable position so leadership can make proportionate decisions based on evidence rather than assumptions.