Patient expectations are changing fast. Access needs to be simple. Information needs to be timely. But care still needs to feel human.
Many practices already offer digital access, online consultations and triage tools. Patients still report feeling unheard. The problem is not the absence of digital routes — it is fragmentation, inconsistency and a sense that access models are designed to protect the system before they help the patient.
When the system can’t meet those expectations, patients look elsewhere. A recent digital health survey found that one in four UK patients (24%) are turning to AI tools or social media for health guidance¹. That shift is concerning, not because patients are using AI in primary care, but because much of what they find sits outside safe, regulated NHS care. When advice is delivered without access to the patient record, without clinical oversight and without accountability, risk does not disappear. It is simply displaced, and often returns to general practice later, harder to see and harder to manage.
How have patient expectations of AI in primary care shifted?
Patients now experience clarity, momentum and choice as standard in almost every other part of life. When healthcare cannot offer the same, delays and uncertainty are interpreted not as pressure on the system, but as a lack of responsiveness to the individual. At the same time, pressure on general practice continues to grow as demand rises but workforce capacity remains stretched. For patients, phone lines are still congested and digital routes do not work equally well for everyone.
This creates a growing gap. Patients want reassurance that their request is moving forward, while practices need to control demand without adding risk or unsustainable work. AI in primary care now sits at the centre of that tension. The risk is not AI itself, but patients turning to unregulated platforms or generic tools where advice may be inaccurate or misleading. On these platforms, there is no connection to the patient record, no clinical oversight, and no clear accountability. The NHS still absorbs the downstream impact, but without visibility or control.
Primary care has an opportunity to respond differently. Not by blocking use of AI in primary care, but by offering safe, trusted tools within NHS pathways that help patients help themselves without stepping outside the system.
How AI in primary care can support different access needs
One of the most persistent mistakes in digital access is assuming all patients want the same thing. Some are confident online and prefer to self-serve if it saves time; others want to speak to someone; some cannot use digital tools at all. Many move between channels depending on urgency, confidence or circumstance. Meeting expectations means designing for that reality, not pushing everyone down a single route.
Used appropriately, digital self-serve tools like Surgery Assist can resolve straightforward admin queries, provide guidance and reduce unnecessary contact by signposting patients to the right care. Placing smart navigation tools ahead of reception helps to free up capacity where it is needed most.
But equitable access also means recognising that the phone remains central. Digital access is not limited to screens and forms. Voice-based AI tools can help to narrow the digital divide, supporting patients who choose to call by capturing information clearly and consistently as part of the conversation. For example, our Omni Consult voice agent can gather information for an Online Consultation request over the phone, automatically filing it into the clinical record. Patients describe their needs in their own words: no typing, no navigation, no requirement for digital confidence. Crucially, requests can only be submitted by registered patients of the practice, and those sent in by phone enter the same triage workflow as those submitted online.
That consistency matters. It means patients who prefer to call are not disadvantaged while practices spend less time filling gaps or chasing missing details. Triage decisions are based on comparable information, regardless of how the request was made.
Meeting expectations during the consultation with AI in primary care
Patient expectations don’t stop at access. They continue into the consultation itself. Patients notice when attention shifts to the screen, when typing interrupts the conversation, and when interactions feel more task-focused than personal.
AI in primary care can change this. Used safely, ambient voice technology captures key information during the consultation, reducing the need for constant note-taking and allowing clinicians to focus on listening and responding in the moment. For Surgery Intellect, powered by TORTUS, this applies across both face-to-face appointments and telephone consultations, where capturing detail accurately without disrupting the flow can be more challenging. For patients, the impact is simple: conversations feel more natural, the clinician’s attention is more clearly on them, not the system.
This is also where expectations around safety are highest. Clinical safety has been one of the biggest barriers to adoption of ambient voice technology in primary care. Practices need confidence in how data is captured, stored and used, and how AI-generated outputs fit within existing clinical responsibility and governance. That caution is well founded. Without clear safety standards, ambient tools risk creating more uncertainty rather than reducing it.
This is why national guardrails around AI in primary care matter. The launch of the NHS England Ambient Voice Technology Self-Certified Supplier Registry provides clearer expectations around safety and governance. As a self-certified supplier on the registry, Surgery Intellect is one of the tools helping practices move from curiosity to confidence, knowing that adoption is aligned with national clinical safety principles.
Meeting expectations here is not about speed. It’s about presence, clarity and trust. When technology supports clinicians to listen more fully, and its use is backed by strong safety assurance, patients can be confident their concerns are heard and recorded properly.
Safe, equitable AI in primary care needs strong foundations
Trust in AI depends on how it is governed.
For AI in primary care, that means clear data protection, robust clinical safety processes and alignment with NHS standards. It also means recognising that practices need practical support to adopt new tools safely, not additional complexity.
Clear risk assessment, transparent governance and support materials aligned to standards such as DCB0160 help practices understand how AI fits into existing safety processes. To support DCB0160 compliance, we provide a Clinical Safety Hazard Log template for our AVT tool Surgery Intellect, powered by TORTUS. This reduces uncertainty and enables informed decision-making, rather than cautious avoidance.
Equity must also remain central to AI in primary care. When self-serve options work well, they reduce pressure on phone lines for those who rely on them. When voice-based AI supports callers, it removes barriers rather than creating new ones. When clinicians are supported during consultations, patients experience better, more focused care.
The future of access is digital by default, not digital-only: voice, online and in-consultation tools working together within a single, governed system. Patients are already reshaping how they seek help. The opportunity for primary care now is not to resist that change, but to shape it safely, fairly and on your own terms.
Want to learn more about what AI in primary care can do for your practice and how to make it work safely, securely and on your terms?