There’s a lot of negativity here. I’ll just say I’m extremely glad I had ChatGPT when I was going through some health issues last year.
Many, many, many doctors (including at a top-rated children's hospital in the US) spent 4+ years unsuccessfully trying to diagnose a very rare disease that my younger daughter had. Scores of appointments and tests. By the time she was 13, she weighed 56 lbs (25 kg) and was barely able to walk 100 yards. Psychiatrists even tried to imply that it was all imaginary and/or that she had an eating disorder.
Eventually, one super-nerdy intern walking rounds with the resident in the teaching hospital remembered a paper she had read, mentioned it during the case review, and they ran tests which confirmed it. They began a course of treatment and my daughter now lives normally (with the aid of daily medication).
I fed a bunch of the early tests and case notes to ChatGPT and it diagnosed the disease correctly in minutes.
I surely wish we had had this technology a dozen years ago.
(I know, the plural of anecdote is not data.)
Yep same, with the caveat that any actionable advice requires actual research from reliable sources afterwards (or at least making it cite sources).
I mean, I kinda get the concerns about misleading people, but … are people really that dumb? Okay, if it's telling you to drink more water, that's common sense. If you're scrubbing up to perform an at-home leg amputation because it misidentified a bruise, then that's really on you.
Same here, right now: I couldn't get up without numbing back pain and could barely walk. ChatGPT educated me on the quadratus lumborum muscle and how to address it ... which was a lot better than my brain going 'well, I'm wheelchair-bound'.
Same here. It’s a double-edged sword, though. I know some people who work in health care, including some doctors. They deal with a lot of hypochondriacs: people who imagine they have all sorts of issues and then try to MacGyver themselves back to health. You can’t read an HN thread on health care without dozens of them coming out of the woodwork to share their magical, special way of beating the system.

Silicon Valley has a long history of people doing all sorts of weird crap. There's a great anecdote about Steve Jobs turning orange when he restricted himself to a diet of carrots because he believed god knows what. In the end he died young of pancreatic cancer. Probably not connected, but he was a smart person who did some wacky stuff that probably wasn't good for him.
I'm on statins and am experiencing some of their common side effects. ChatGPT was useful for figuring some of that out. I've had other minor issues where just trying to understand what a prescribed medication is supposed to do can be helpful. Doctors aren't great at explaining their decisions: "Just take pill x, you'll be fine".
Doctors have to diagnose patients in a way that isn't that different from how I would diagnose a technical issue, except they are starved for information: they have to get everything out of a 10-15 minute consult with a patient who is only describing vague symptoms. It's easy to see how that sometimes goes wrong, or how they might miss critical things. And they deal with all the hypochondriacs as well, so they have to poke through that too and can't assume the patient is actually being truthful.
LLMs are useful tools if you know how to use them. But they can also lead to a lot of confirmation bias. The best doctors tell you what you need to hear, not what you want to hear. So, tools like this are great and now a reality that doctors need to deal with whether they like it or not.
Part of the Covid crisis intersected with early ChatGPT usage, and it wasn't pretty. People bought into a lot of nonsense they came up with while doomscrolling Reddit or using early versions of LLMs. But things have improved since then: LLMs are better and less likely to go completely off the rails.
I try to look at this rationally: I know I don't always get the best care possible, because doctors have to limit the time they spend on me, and I'm publicly insured in Germany, so subject to cost savings. I can help myself to some extent by doing my homework, but in the end I have to trust my doctor to confirm things. My approach is to use ChatGPT to understand what's going on and then give my doctor a complete picture, so he has all the information needed to help me.
I know someone who used ChatGPT to diagnose themselves with a rare and very specific disease. They paid out of pocket for some expensive and invasive diagnostics that their doctor didn't want to perform, and it turned out, surprise, that they didn't have the disease. This person's faith in ChatGPT nonetheless remains just as high.
I'm constantly amazed at the attitude that doctors are useless and that their years of medical school and practical experience amount to little more than a Google search. Or, as someone put it, "just because a doctor messed up once, it doesn't mean that you're the doctor now".