Hacker News

SecretDreams yesterday at 2:19 PM

LLMs for medical info are good, but they're easy to abuse. I've got a friend who is an anxious mom. They use GPT/Gemini to "confirm" all of their suspicions and justify far more doctor visits than is reasonable, while also getting access to more recurring antibiotics than is reasonable. LLMs are basically handing them the gunpowder to waste doctors' time and slam an already stressed medical system when, most of the time, all their kids need is some rest and soup.


Replies

ramoz yesterday at 4:54 PM

Yeah, I'm in a particular health community. A lot of anxious individuals there, for good reason, end up posting a lot of nonsense derived from self-reinforcing ChatGPT conversations.

That said, when used as a tool you have power over, ChatGPT has also eased some of my own anxiety. I've learned a ton from it as well. It's often been more helpful than the doctors, and it acts as an always-available counsel.

hsuduebc2 yesterday at 2:34 PM

Yeah, it’s a very powerful tool, and it needs to be used carefully and with intent. People on Hacker News mostly get that already, but for ordinary users it’s a full-on paradigm shift.

It moved from: A very precise source of information, where the hardest part was finding the right information.

To: Something that can produce answers on demand, where the hardest part is validating that information and knowing when to doubt the answer and force it to recheck the sources.

This happened within a year or two, so I can't really blame them. The truth machine, where you didn't need to focus much on validating answers, rapidly turned into a slop machine where, ironically, your focus matters far more.
