Hacker News

hackitup7 yesterday at 3:54 AM

I've had a similar positive experience and I'm really surprised at the cynicism here. You have a system that is good at reading tons of literature and synthesizing it, which then applies basic logic. What exactly do the cynics think that doctors do?

I don't use LLMs as the final say, but I do find them pretty useful as a positive filter / quick gut check.


Replies

EagnaIonat yesterday at 8:09 AM

This is the crux of the argument from the article.

> get to know your members even before the first claim

Basically, it means selling your data so they can maximise profits from you and avoid taking on a burden.

You are also not protected by HIPAA when using ChatGPT.

mattmanser yesterday at 8:40 AM

Because we've all used LLMs.

They make stuff up. Doctors do not make stuff up.

They agree with you. Almost all the time. If you ask an AI whether you have in fact been infected by a werewolf bite, it's going to try to find a way to say yes.
