My eyes slowly fluttered open. “Is it all over?” I asked, scanning the sterile room for a friendly face.
It was, the anaesthetist hovering over me confirmed. I wasn’t imagining it: for the first time in years, my body felt pain-free.
“Thank you so much, thank you so much,” I repeated on loop, as my bed was wheeled to the recovery ward. The words kept flowing even after the nurse closed the flimsy blue curtain around my bed and left me to rest.
While the many medical professionals who crossed my path that day likely chalked my blabbering up to the sedatives in my system, I meant every word of it. Not only had a simple procedure removed all of my mind-numbing pain, it had saved my life.
But things could have easily turned out differently.
After almost a hundred GP appointments, ultrasound scans, blood tests and tentative (incorrect) diagnoses, I had finally found the answer to the source of my agony: a rare condition called chronic cholecystitis, which causes repeat inflammation of the gallbladder. For me, it had reached the point where the organ had stopped working entirely and a 3cm stone had formed inside. Without urgent surgery, I had just a 40% chance of survival.
Only it wasn’t a doctor who first suggested chronic cholecystitis could be the cause of my suffering. It was AI – or, more specifically, ChatGPT.
It was after yet another disappointing meeting with my doctor – where a set of blood test results was deemed ‘normal’, in spite of me feeling anything but – that I turned to a chatbot over a real-life medical professional.
Sat in my room, facing another afternoon of pain without answers, out of sheer desperation I typed my symptoms into ChatGPT and uploaded my blood test results. I waited tentatively for a response, trying to push any worries about sharing such intimate data to the back of my mind.
One of my blood tests, which checked for an inflammation marker called C-reactive protein, had come back borderline high a few days after a painful 16-hour flare-up, but my GP said they didn’t see a need to explore it further. ChatGPT did, however.
“In the context of severe abdominal pain, this could mean very early-stage inflammation,” it told me, before suggesting a number of further tests I might want to consider pursuing. It also gave me a list of potential conditions to look into – among them gallbladder disease, the umbrella term that covers chronic cholecystitis. It was the first time it had ever been suggested to me.
I immediately booked a follow-up appointment with my GP. She was reluctant to send me to a gastroenterologist, but when I casually asked whether it could be something to do with my liver, kidneys or gallbladder – one route of inquiry we hadn’t yet tried, and without mentioning I’d ChatGPTed it – she booked me in for an upper abdominal and back ultrasound “just in case”.
It was this ultrasound that revealed the extent of my gallbladder disease. Later, another doctor confirmed how "dangerous" it was, given it had been left untreated for so long.
In the moment, the long-term implications of giving away such personal health data were far from the forefront of my mind. I was just desperate for answers.
Goodbye GP, hello ChatGPT?
It’s no secret that the NHS is facing immense pressure right now. In September 2025, the Referral to Treatment waiting list had 6.24 million people on it, with 180,300 of them waiting more than a year.
For women, it can feel even harder to have your voice heard when it comes to health concerns, thanks to medical misogyny being alive and well in the UK. One recent report by the Women and Equalities Committee revealed women frequently have their ‘pain dismissed’ when seeking help. Women’s health was also only introduced as a required module for medical students in 2024 – with ‘particular emphasis’ on training for GPs.
So, is it any wonder then that 15.1 million of us in the UK used ChatGPT in July 2025 alone – with health advice-seeking ranking as the fourth most popular reason why?
Laura, 29, is one such person. For years, she’d experienced debilitating period pain, bloating, nausea and pain during sex, but claims she was routinely dismissed by medical professionals, who chalked it up as “part of being a woman.”
Eventually, after pushing for an MRI and an exploratory laparoscopy – the most effective way of diagnosing endometriosis (a chronic illness where tissue similar to the lining of the womb grows elsewhere in the body), performed via a small camera inserted into the abdomen – Laura was told she had no abnormalities. She felt deflated and confused, unclear on what to do next.
“They said there was nothing to worry about; it was all fine,” Laura recalls. “To be honest, I felt like from the beginning I wasn’t asking the right questions, so I wasn’t getting the answers I needed.
“There was an expectation that I would understand everything that was going on with me. But before ChatGPT, I didn’t.”
Refusing to accept that what she was facing was ‘normal’, Laura sent through her MRI and laparoscopy results to ChatGPT. She claims the AI noted signs of bowel endometriosis, and prompted her to ask for a second opinion. When she asked ChatGPT how to go about this, it provided a template letter to send to her GP.
Since seeking that second opinion, Laura has received a diagnosis of endometriosis and is currently on the waiting list for surgery that will hopefully ease some of her symptoms.
ChatGPT isn’t a diagnostic tool, and, just like Googling your symptoms, it doesn’t get everything right – but OpenAI proudly claims the chatbot can help people advocate for better healthcare.
A spokesperson for OpenAI told Cosmopolitan UK: “ChatGPT is not intended to replace qualified clinicians, but it can provide information to help people understand and advocate for their care.
“We know people turn to ChatGPT for these reasons, and work closely with health experts to stress-test our models, identify risks, and improve responses.”
OpenAI has built a ‘Global Physician Network’ to inform its research. Made up of nearly 300 physicians and psychologists, the network, the company claims, helps advise it on supporting safe interactions between ChatGPT and the people using it.
Women’s health expert Dr Shirin Lakshmi, who worked for the NHS for four years, says it’s “unsurprising” women are turning to ChatGPT in the wake of systemic misogyny. But she makes plain that the chatbot isn’t foolproof, and will never be able to rival face-to-face consultations with qualified medical professionals.
“When it comes to healthcare, women notoriously find it more difficult to get the help they need," she said.
“So it’s no surprise that some women are turning to AI to give health advice because they aren’t getting enough from the NHS.
“[While ChatGPT] can be a powerful tool for both professionals and patients alike, if you are already anxious when it comes to your health, it can actually worsen your experience in comparison to speaking with a doctor in person.
“Sometimes you need to rely on a doctor’s instincts when things don’t add up. It’s vital that we don’t replace the human element when seeking a diagnosis – at the risk of worsening a patient’s health anxiety and possibly making situations worse than they ever needed to be.”
AI-induced anxiety
Lydia, 26, is firm in her belief that ChatGPT is a double-edged sword. While she partly credits the chatbot with helping her push for an Obsessive Compulsive Disorder diagnosis, she also warns that it hugely exacerbated her condition by allowing her to seek reassurance for hours on end. Others with anxiety report similar experiences.
“I don’t think ChatGPT is all bad - it helped me to distinguish between my OCD symptoms and generalised anxiety symptoms, when I was misdiagnosed in 2024,” Lydia shares. “But here’s the problem: you’ve got two parts of OCD, obsessions and compulsions.
“The obsession involves having intrusive thoughts about a particular subject, like health, morality or cleanliness. Then, the compulsions involve rituals or behaviours which bring the anxiety down.
“When you’re in therapy for OCD, your human therapist won’t let you ‘do reassurance,’ as the long-term goal is living a life without anxious or intrusive thoughts. ChatGPT doesn’t have that built into its system, so you could be sat ‘talking’ to it constantly, all day, reassurance-seeking about different symptoms – and it’ll happily answer you each time.”
Although it can provide temporary relief, it’ll never be enough, she adds. “It just reinforces your beliefs in the long-term.”
Multiple anonymous users on the subreddit r/Anxiety have even claimed ChatGPT misdiagnoses have led to panic attacks and hospitalisations. One user described checking their symptoms with the chatbot daily for eight months, after it convinced them they had a degenerative spine condition. Now, they believe “they were creating 50% of [their] aches” themselves.
Another user said ChatGPT ‘misdiagnosed’ their chilblains – a condition which causes inflamed or swollen patches on the hands or feet – as lupus.
Concerns extend beyond anxiety and the mislabelling of physical ailments, too: researchers and psychiatrists are raising the alarm about chatbot- or AI-induced psychosis. This can see users start to believe a chatbot is their partner, or even a god – and some early reports claim that anyone can experience it, irrespective of their mental health history.
Mental health charity Anxiety UK highlights that using AI for symptom-checking comes with notable risks for those already diagnosed with health anxiety, and has urged people to proceed with caution.
A spokesperson told Cosmopolitan UK there are particular concerns about AI’s tendency to “hallucinate” (essentially, churning out confident-sounding answers that are inaccurate or not tailored to an individual’s medical history), as well as its increasingly sophisticated responses – which mean it may not be immediately obvious to the user that they’re communicating with a non-human service.
ChatGPT has been an incredible tool for me personally when it comes to gathering information about my symptoms, and I still use it now to help me navigate bumps in the road post-operation (for example, how to manage an infection in one of my wounds when my GP and the NHS 111 service were not especially helpful). But it’s clear there are downsides.
And though it nudged me in the right direction, I can’t entirely credit ChatGPT with my diagnosis. My unwavering self-advocacy, something that is human through and through, won me that.
If you’re struggling with health anxiety, Anxiety UK can be contacted via their Infoline number: 03444 775774.