CQC inspection approaching
Can you show how AI tools are governed, monitored and reviewed?
Best next step
Board Findings Report + RAG Exposure Map
For private GP, dental and specialist clinics using AI with patient data
Founder-delivered by a CISM- and CRISC-certified AI governance consultant with 20+ years' experience in regulated environments: Faisal Ali, Founder and Principal Consultant of ELSA AI.
ELSA AI helps clinics discover which AI tools are already in use, where patient data may be involved, what evidence is missing, and what should be prioritised in the next 30 days.
Starting point: Clinical AI Exposure Diagnostic™
4 working days from completed intake · Fixed fee £4,500–£6,500 + VAT
A Royal College of Physicians snapshot survey in January 2026 found that 69% of 305 UK physician respondents had used AI tools such as ChatGPT and Microsoft Copilot for clinical questions via personal access.
Led by Faisal Ali, CISM, CRISC, Founder and Principal Consultant of ELSA AI, with more than two decades of experience across cybersecurity, information risk and AI governance in regulated environments.
COMMON TRIGGERS
Private clinics usually contact ELSA AI when informal AI use becomes an evidence question, prompted by a CQC inspection, DPO request, insurer renewal, MDO query, ambient scribe rollout or board review.
Each trigger points to the same issue: can the clinic show what AI is in use, what patient data may be involved, what evidence exists and what needs action next?
Can you show how AI tools are governed, monitored and reviewed?
Best next step
Board Findings Report + RAG Exposure Map
Can you answer AI, data protection and clinical oversight questions accurately?
Best next step
Disclosure Readiness Note
Can you show which AI tools process patient data and whether DPIA review is needed?
Best next step
DPIA Readiness and Patient Data Exposure Note
Do you have DPIA readiness, vendor evidence and patient transparency in place before routine use?
Best next step
Ambient Scribe Assessment Sheet
Can clinicians evidence that AI use was approved, supervised and documented?
Best next step
MDO, PMI and Insurer Disclosure Readiness Note
Can leadership see what AI is in use, what risk exists and what action is required?
Best next step
Board Findings Report + 30-Day Priority Action Plan
The Clinical AI Exposure Diagnostic™ gives clinic leadership a board-ready view of AI use, patient-data exposure, evidence gaps and priority actions in four working days from completed intake.
Fixed fee £4,500–£6,500 + VAT.
Advisory governance support only. Not legal advice, CQC certification, ICO approval, insurer coverage advice, MDO indemnity advice or clinical safety case sign-off. Final decisions remain with the clinic's accountable officers and advisers.
ELSA AI works with four clinic types where AI exposure is real, regulated and already in motion.
CQC-regulated · Primary segment
For private GP, executive health and GP-led clinics using or evaluating ChatGPT, Copilot, ambient scribes, transcription or admin AI with patient data.
Pre-rollout or live · High urgency
For clinics using or evaluating ambient scribes and AI note tools where consultation audio, identifiers and clinical text need a documented governance position.
Single-site or group
For dental practices and groups using AI imaging, note drafting, transcription, patient communications or office AI where patient data may be involved.
Dermatology · Aesthetics · Diagnostics · Fertility · Ophthalmology
For doctor-led specialist clinics using AI with patient images, consultation notes, correspondence or clinical workflows.
ELSA AI is designed for CQC-regulated private GP, dental and specialist clinics where AI is already being used with patient data – for example ambient scribes, ChatGPT, Copilot, imaging AI or supplier platforms.
It is a good fit where there is at least a small clinical team, formal CQC registration and DPO/board interest in AI governance. Very small, single-handed practices using only basic office automation may be better served by simple policy templates rather than a full Diagnostic.
A focused 4-working-day assessment showing what AI is being used, where patient data may be involved, what evidence is missing and what should be prioritised next.
Step 1
Day 1
Leadership intake, evidence request, confidential role-level staff survey and initial shadow AI mapping.
Step 2
Days 2–3
Review AI tool inventory, patient-data exposure, DPIA readiness, vendor evidence, ambient scribes where applicable, human oversight and disclosure-readiness indicators.
Step 3
Day 4
Board Findings Report, RAG Exposure Map, 30-Day Priority Action Plan and source-mapped evidence appendix.
A board-ready evidence pack showing what AI is in use, what patient data may be involved, what evidence exists, what is missing and what should happen next.
Evidence & guides
AI governance pressure usually arrives as a request for evidence: from the DPO, board, insurer, MDO, CQC inspector, clinical lead or patient. These guides explain what private clinics may need to have ready before AI use becomes difficult to explain.
A practical checklist of the board, DPO, vendor, patient transparency, ambient scribe, insurer and incident evidence a clinic may need when AI touches patient data.
View checklist →
A focused guide for clinics using or planning Heidi, Tortus, Accurx Scribe, Dragon DAX, Tandem, Nabla, Otter or similar tools.
Review scribe checklist →
What a DPO may ask for when AI tools process patient data, including DPIA indicators, vendor evidence, privacy notices and Records of Processing Activities.
Read the DPO guide →
Advisory governance support only. These guides are not legal advice, DPIA sign-off, CQC certification, ICO approval, insurer coverage advice, MDO indemnity advice or clinical safety case sign-off.
The Diagnostic does not claim to fix every AI risk in four working days.
It gives leadership a documented starting position: what AI is in use, what patient data may be involved, what evidence is missing and what should be prioritised next.
Example outcomes
Illustrative scenarios based on typical clinic profiles, not specific clients.
From unknown AI use to a board-readable exposure map
A GP-led executive health clinic identifies declared and informal AI use across clinical, admin and support teams. Leadership receives an AI Tool and Use-Case Inventory, RAG Exposure Map and 30-Day Priority Action Plan showing which tools need DPO review, vendor evidence or staff guidance first.
Preparing for ambient scribe rollout
A doctor-led specialist clinic preparing to use an ambient scribe receives a structured view of DPIA readiness, vendor evidence gaps, patient transparency wording needs, human-review workflow and clinical safety ownership points for review by its DPO, clinical lead and accountable officers.
Moving from shadow AI to approved-use guidance
A dental group finds staff using personal AI tools for drafting, notes and admin support. The Diagnostic helps leadership distinguish approved, conditional and prohibited use, identify patient-data exposure risks and prioritise staff guidance, vendor evidence and DPO review actions.
Each scenario leads to the same starting point: a documented AI governance position the clinic can review, own and act on.
Diagnostic first. Launchpad second. Sentinel third. ELSA AI starts with a documented view of current exposure, then helps clinics build and maintain a working governance baseline.
Step 1
Starting point
Identify what AI is in use, where patient data may be involved, what evidence is missing and what actions should be prioritised in the next 30 days.
Step 2
Convert Diagnostic findings into a board-approved governance baseline: policy, register, risk register, DPIA readiness pack, vendor evidence, patient transparency, staff guidance, incident process and board evidence pack.
Step 3
Keep AI governance evidence current as tools, staff use, vendor terms, insurer questions and regulatory expectations change.
Confidential discovery call with Faisal Ali. No commitment required.
Faisal Ali
Founder and Principal Consultant, ELSA AI
ELSA AI engagements are led by Faisal Ali, CISM, CRISC, Founder and Principal Consultant of ELSA AI. Faisal brings more than two decades of experience across cybersecurity, information risk and AI governance in regulated environments.
ELSA AI was built for private healthcare providers deploying third-party AI tools, not building AI products from scratch. The focus is practical evidence: what tools are in use, what patient data may be involved, what controls exist, who owns the risk and what decision-makers need to see.
ELSA AI structures evidence so the clinic's own accountable officers and advisers can review, adopt and own the final position.
The starting point is a confidential 20-minute conversation.
We will confirm whether the Clinical AI Exposure Diagnostic™ is the right fit for your clinic, what tools and workflows should be in scope, and whether there is a time-sensitive trigger such as an ambient scribe rollout, DPO review, insurer renewal, MDO question, board meeting or CQC inspection.
20 minutes
Direct with Faisal Ali
No commitment required
Confidential · No obligation · Senior-led from the first call
Advisory governance support only. Not legal advice, regulatory approval, CQC certification, insurer coverage advice, MDO indemnity advice or clinical safety case sign-off. Where needed, evidence is structured for adoption and sign-off by the clinic's own legal advisers, clinical safety officers and indemnity providers.