MedEd AI Epoch 129: 1,000 Diseases? Dr AI Will See You Now

A headline like "1,000 diseases? Dr AI will see you now" immediately catches the eye. When I came across this piece in the Metro (September 18, 2025), I found myself pausing. The article described how researchers trained an AI tool, Delphi-2M, on 400,000 anonymised UK patient records. The system could predict an individual’s risk of developing over 1,000 diseases in one go. It’s the sort of news that sounds like science fiction. Yet it’s happening now.

The promise is clear: helping doctors anticipate, prevent, and treat disease before it strikes. But the reality is complex, especially in the UK context, where our NHS is under immense strain and conversations about digital health, prevention, and patient empowerment are no longer optional; they are urgent. As someone deeply engaged in AI, mixed reality, and digital health in medical education, I couldn’t help but reflect.

The Story That Struck Me

Imagine walking into your GP’s office and, instead of vague lifestyle advice, you’re told: “You have a high probability of developing cardiovascular disease in the next 10 years, but here are two precise interventions that could change your trajectory.”

That’s not just technology; that’s personalised hope. But it’s also unsettling. It raises questions about how much we want to know, how ready we are to act, and how well-prepared our doctors and future healthcare professionals are to interpret such predictions responsibly.

This news made me pause and reflect on three powerful questions I’d like to ask you:

  1. Would you really want to know your probability of developing hundreds of diseases, and would it empower you or overwhelm you?
  2. How ready are our healthcare systems, particularly the NHS, to integrate AI responsibly while maintaining compassion, equity, and trust?
  3. Are we preparing our young healthcare professionals with the right AI literacy to understand, critique, and use these tools ethically in practice?

In my academic and professional journey, my focus has been on how AI and immersive technologies (like mixed reality and VR) can transform medical education, clinical reasoning, and patient care. My aim is not just to integrate tools, but to foster critical awareness: helping budding doctors, dentists, and allied health professionals ask the right questions rather than blindly accept what an algorithm tells them.

We need a generation of clinicians who are as comfortable questioning AI outputs as they are reading lab results: clinicians who see AI not as a replacement, but as a partner in decision-making.

A Call for AI Literacy

This article reminded me why I am so committed to embedding AI literacy into medical curricula. It’s not only about the technology; it’s about cultivating a mindset of curiosity, scepticism, and responsibility. Because the reality is: AI will be in the consultation room. The real question is: will our future clinicians be ready to meet it? I’d love to hear your thoughts on the three questions above. Let’s open the discussion, because the answers will shape the future of healthcare, not just in the UK but globally.


Nigel Adams

Professor & Director, Buckingham Enterprise & Innovation Unit (BEIU), Vinson Building, University of Buckingham. BA (Hons) FCIM

4d

A great question, Shazia. We all must keep working hard to ensure that you and others who "get it" are heard!

Adeniyi Akiseku

Women's Health Researcher | Obstetrician & Gynaecologist Specialist | Driving Innovation and Equity in Digital Health | Medical Educator

5d

Completely agree. For clinicians, the challenge will be using AI as a supportive partner, helping with speed and accuracy, while never losing sight of empathy, communication, and the human touch that patients value most.
