Hacker News

taiko · yesterday at 11:04 PM · 5 replies

It's honestly such a big problem. One of my colleagues uses an AI scribe. I can't rely on any of his chart notes because the AI sometimes hallucinates (I've already informed him). It also tends to write a ridiculous amount of detail that is totally unnecessary, while leaving out important details, so I still need to comb through patient charts myself (med rec, consults, etc.). In the end it creates more work for me. And if my colleague ever gets a college complaint, I have no clue how he's going to navigate any AI-generated errors. I'm all for AI, and it's great for things like copywriting, brainstorming, and code generation. But from what I'm seeing, it's creating a lot more headache in the clinical setting.

If you're wondering why this guy doesn't just check the AI scribe notes: well, probably because, with the amount of detail it writes, he'd be better off writing a quick SOAP note from scratch.


Replies

batshit_beaver · today at 12:15 AM

> I'm all for AI and it's great for things like copywriting, brainstorming and code generation

It's funny how the assumption is always that LLMs are very useful in an industry other than your own.

rconti · yesterday at 11:20 PM

It feels very much like AI is creating AI lock-in (if not AI _vendor_ lock-in) by creating so much detailed information that it's futile to consume it without AI tools.

I was updating some GitLab pipelines and some simple testing scripts, and the AI created 3 separate 300+ line README-type metadata files (I think even the QUICKSTART.md was 300 lines).

_AzMoo · today at 12:37 AM

My (extensive) experience with LLM code generation is that it has the same issues you describe in your field: hallucinations, over-engineering, and missed requirements or established patterns.

But engineers have these same problems. The key is that the content creator (engineers for codegen, doctors for medicine) is still responsible for the output of the AI, as if they wrote it themselves. If they make a mistake with an AI (e.g., including false data from hallucinations), they should be held accountable in the same way they would if they made the mistake without it.

acuozzo · today at 12:29 AM

> I'm all for AI and it's great for things like copywriting, brainstorming and code generation

That's funny. I would have said the same thing about your field prior to reading your comment.

dmtroyer · today at 2:39 AM

sounds like they need a better instructions.md
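For what it's worth, a minimal sketch of what such a file might look like (the filename and directives are hypothetical; no particular tool guarantees it will honor any of this):

```markdown
# instructions.md (hypothetical project guidance for an AI assistant)

- Do not create new documentation files (README, QUICKSTART, etc.)
  unless explicitly asked.
- Prefer editing existing files over adding new ones.
- Keep any generated documentation under 50 lines.
- When updating CI pipelines, change only the jobs named in the request.
```

Of course, whether the model actually follows instructions like these is exactly the problem the thread is describing.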