“Over three decades, Google designed and delivered a search engine where credible and accessible health content could rise to the top of the results.
“Searching online for information wasn’t perfect, but it usually worked well. Users had a good chance of clicking through to a credible health website that answered their query.
“AI Overviews replaced that richness with a clinical-sounding summary that gives an illusion of definitiveness.
“It’s a very seductive swap, but not a responsible one, and it often ends the information-seeking journey prematurely. The user has a half answer, at best.
“I set myself and my team of mental health information experts at Mind a task: 20 minutes searching using queries we know people with mental health problems tend to use. None of us needed 20.
“Within two minutes, Google had served AI Overviews that assured me starvation was healthy. It told a colleague that mental health problems are caused by chemical imbalances in the brain. Another was told that her imagined stalker was real, and a fourth that 60% of benefit claims for mental health conditions are cases of malingering. It should go without saying that none of the above is true.
‘Very dangerous’: a Mind mental health expert on Google’s AI Overviews