Seeking health information online and self-diagnosing with the help of artificial intelligence is becoming increasingly common, raising concerns among healthcare professionals. This trend is particularly noticeable in pediatric care, where misinterpretations of AI-generated information can lead to unnecessary anxiety and complex medical visits.
Pediatricians are finding themselves frequently correcting misunderstandings stemming from online sources. Jeff Huser Pitteloud, a member of the committee of the Vaudois Pediatric Association, experiences this several times a week in his practice. “A typical example: I receive a stressed parent who is convinced their child is unwell, while from my perspective, the child appears healthy,” the doctor illustrates.
In these cases, parents have formed opinions based on responses generated by platforms like ChatGPT, often derived from articles shared in TikTok videos and taken out of context. “ChatGPT provided information without considering their specific situation, which caused them anxiety. I spent 45 minutes deconstructing what they had read, going through it point by point to reassure them and help them understand that their child was fine,” explains Dr. Huser Pitteloud.
Sometimes, a single appointment isn’t enough. Parents continue their research, discuss their concerns with others, and return with fresh anxieties or requests for specialized opinions, adding to the complexity of care.
A Different Logic Than That of Doctors
AI is not infallible and can make incorrect diagnoses. A study conducted by the University of Oxford found that AI models don’t perform better than a standard online search, with only one-third of participants receiving a correct diagnosis.
AI responses also vary depending on how questions are phrased and whether the user continues to probe, much as a doctor's assessment would evolve with further questioning.
“AI lists all possibilities, while a doctor assesses the probability of those possibilities,” notes Jeff Huser Pitteloud. “AI has a lot of information, but it lacks the discernment of human evaluation.”
Fundamental Human Connection
The human connection remains essential. Sébastien Jotterand, co-president of the Swiss Association of Family Doctors and Pediatrics, hasn’t observed a significant increase in the use of AI for health diagnoses.
“I believe people understand that, outside of very standardized situations like emergencies, there’s always room for doubt, and that needs to be discussed. And for that, you need a human being made of flesh and blood who, like patients, will one day die,” he explains.
“Human beings need to encounter other humans. That’s what’s reassuring to me,” he concludes.
Alexandra Richard / juma