Hacker News

Refuse to let your doctor record you

118 points by speckx today at 3:48 PM | 126 comments

Comments

burnte today at 4:16 PM

I'm a healthcare CIO of 12 years, and I've evaluated 4 and deployed 2 of these tools, one of which is currently in use at my current healthcare employer. I am very measured on AI, but the results I've seen from these virtual scribes are HUGE. In every case we have IMMEDIATELY seen improvements in patient NPS scores, provider satisfaction, and note quality. Notes are more standardized as well as more verbose and detailed, which makes it easier for future providers to understand the case. These better notes reduce our claim rejection rate.

And what converted me was direct patient response. Across the board patient feedback is extremely positive, with the most common comment being along the lines of "I really felt like the doctor connected with me better and they were more present in the visit."

These AI scribes really DO improve patient care, I've seen it with my own eyes.

show 17 replies
0xbadcafebee today at 6:53 PM

I think this is just humans not understanding things and irrationally being afraid of one thing more than another thing. You're afraid of someone listening to you; you're not afraid of someone copying the documents that detail every single aspect of your health (EHR records).

Healthcare records are probably the most strongly protected personal information in the world. Remember that most of the data about you is not protected by law. Credit reports, ISP records (including your SS#), your entire email archive, Google Drive, etc. could get leaked, and for the most part there's no legal consequence. But if a record of you having the flu in 3rd grade gets leaked by a 3rd party connected to health record keeping, there are real consequences (not only for the leak, but even for not reporting it).

If anything, I want everything I say to be recorded and kept on file for later reference. The danger of a speech-to-text engine transcribing incorrectly is real, but that doesn't mean I don't want the notes there. I just want the audio included with the text. Both will be useful to refer to later on, especially as STT models improve their accuracy (we've seen amazing leaps in accuracy in just 1 year).

However, we do need to ensure that these records are protected from government over-reach. Currently the government can request your health records, without notifying you, for a slew of reasons. This enables the government to go on a fishing expedition, doing the equivalent of an unreasonable search of private information, and you will have no notification and no way to respond. We must create laws that provide stronger privacy rights for sensitive health information to resist government overreach. Another legal hole is 3rd party apps that collect sensitive health information, but aren't provided by your doctor. Your step-tracking, heart-monitoring app is not protected by HIPAA. Same for employer health records.

pclowes today at 4:06 PM

I understand the concerns and I am not sure I would allow myself to be recorded until I knew more.

However, I do think we are in a situation where everybody knows that healthcare costs need to come down, and that doctors and medical professionals are spread too thin, forced to see ever more patients in the same number of hours, and yet for every attempt to improve efficiency there is a “no, not that way” response.

show 11 replies
kube-system today at 4:15 PM

There might be some real concern about the cognitive and patient-interaction impacts of speech recognition being used... but on the other hand, it's more likely that details are missed when information is captured manually.

And the privacy/informed consent concerns here are silly; they apply to any of your charted data... and if you're going to any office that doesn't use the latest technology, your patient information is probably being sent between offices over fax anyway.

scrawl today at 4:07 PM

> The false promise of efficiency [...] that is extremely unlikely to mean more time with each patient. Instead, it will mean more patients.

nit: that is a real efficiency gain. seeing more patients sounds better on the face of it.

show 2 replies
pavel_lishin today at 4:33 PM

Down for me, but Internet Archive grabbed a copy: https://web.archive.org/web/20260424151739/https://buttondow...

croisillon today at 7:45 PM

my boss fell in love with a software that invites itself to team calls, then transcribes and summarizes them, so the whole office gets that automatically; once in a while my colleagues and i look into that report, it's sometimes hilariously wrong, most of the time just useless

dawnerd today at 7:22 PM

I have been loving my new doctor's recording and making everything available in the patient portal. No more trying to remember what they said. That's huge, especially when dealing with elderly patients, since their caregivers can have access to it too.

cromka today at 4:22 PM

This is seriously a good example of a domain that should enforce on-premises AI. Doctors absolutely can afford to buy an NVIDIA workstation. Transcribing speech to text is not exactly super demanding, comparatively speaking. When did we even stop considering non-cloud services? If AI had boomed 10 years ago, we wouldn't even be discussing this.

show 1 reply
apparent today at 6:26 PM

> But, especially given the underfunded nature of the US health system, that is extremely unlikely to mean more time with each patient. Instead, it will mean more patients.

So that means if I try to make an appt, I'll have an easier time getting one? Sounds good, I guess.

rolph today at 7:27 PM

a preemptive notification to the health care team.

"to whom may be concerned."

[Doctor Standinghere, as a patient i have no trust or confidence regarding the security and integrity of my personal information in regards to AI scribing.

for this reason i will scribe for you, as that is the most accurate account of what i intend to communicate with you.

i will refrain from verbal communication and will provide on the spot written communication with respect to health care interaction. ]

the_gipsy today at 4:21 PM

I live in a country with free public healthcare. In a recent doctor's visit, the doctor was interviewing me while a nurse was typing into the computer. Presumably so that the doctor would have more time to attend patients and so that she wouldn't get distracted.

It's fascinating how this translates to the idea that in the USA this should mean "more time with patients", but in reality also means "more patients", and is somehow bad because there is a monetary drive.

show 2 replies
dlcarrier today at 4:19 PM

I'm more concerned about a record being made in general than how it is made. If I were to be affected by a tragedy and visit a psychologist or psychiatrist to receive support, it would likely require a diagnosis of depression to get insurance coverage, and having that on my record could make it more difficult and costly to legally fly an airplane or own a gun, and who knows what else.

show 1 reply
nubinetwork today at 4:13 PM

Dead link (or it was?)

adit_ya1 today at 4:15 PM

Almost every point follows the same structure:

> "Here is a real concern about implementation" → "Therefore you should refuse entirely"

This skips the middle step of "therefore we should implement it well."

I'm not convinced that we should be allowing doctors to record patient visits at this stage, but I'm really not convinced by these points, which largely don't hold up under closer examination.

A few that stuck out:

"Privacy" - Labs are routinely sent to third-party companies, and we don't do informed consent for that. The third-party argument isn't unique to recording.

"False promise of efficiency" - This doesn't really have anything to do with patients at all. It's a criticism of medical office management, not of physician-patient interactions. Telling patients to refuse a tool because management might exploit the productivity gains is asking patients to fight a labor battle on the provider's behalf.

"Consent can't be revoked mid-visit" - Consent typically can't be revoked in the middle of an appendectomy, or halfway through administering a vaccine either. Practical irrevocability is a normal feature of informed consent, not a special problem unique to recording. Proper consent processes in medical offices are a broader issue than consent about voice recordings specifically. Had the authors made the point that providers are being asked to obtain consent for tools whose technical implementation and privacy risks fall outside the provider's own domain knowledge — that would be a stronger argument. But that isn't quite the point they made, and their current framing doesn't wholly convince.

show 2 replies
daedrdev today at 4:25 PM

HIPAA exists and has a lot of teeth. Given this extensive liability, I trust that if anything does go wrong they will be punished. Recordings might dramatically improve patient outcomes, so I will let them record.

show 3 replies
jll29 today at 4:51 PM

Advice: Regardless of whether you opt in or out, you should only permit anyone to record you if you get a copy of the recording for your own records.

jcalvinowens today at 5:03 PM

Honestly, my recent experience with this was really positive: the doctor actually said the technical stuff out loud to me for the first time in my life, in a way I could easily ask polite questions about and discuss with them.

In my case it was something very not sensitive, removing a benign tumor in a finger, which I have no problem telling the whole world about (I was awake for the surgery and got to watch, it was an incredibly fascinating experience that I want to write more about some day).

But I can imagine it would feel much more invasive if the subject were more sensitive.

xxpor today at 4:17 PM

This was extremely unconvincing for me. The site is now 500ing for me, so I can't fully quote it, but the arguments about privacy just fall flat. You don't know about Epic's, or GE's, or Philips' security either. You have to trust the institution of HIPAA et al. overall to at least make things right.

I really don't care if my recording becomes training data.

I would rather be spoken to like I'm not an idiot. Use technical terms please. I want precision.

Calling the US healthcare system underfunded might be the most wild part of the whole thing. We spend 5.3 trillion dollars a year. That's 17% of the entire economy.

show 1 reply
tristanb today at 5:35 PM

This is a really great way to get yourself worse care.

moralestapia today at 6:26 PM

Lol, this essay is missing (or starting from the assumption) that speech-to-text algorithms do a good job, even state of the art ones.

That is far from correct, and the main reason I would oppose this is that the AI might incorrectly record something in the transcript that completely derails my diagnosis and treatment.

There's a big difference between:

"I have had nausea for the past three days"

and

"I have not had nausea for the past three days"

And I'm being generous with my example.
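A small sketch of why this failure mode is so insidious (plain Python, no real STT model; the two example sentences are from the comment above): by the standard word-error-rate metric, the second transcript differs from the first by a single inserted word — roughly 11% WER, a rate a vendor might well advertise as acceptable — yet the clinical meaning is inverted.

```python
def word_error_rate(ref: str, hyp: str) -> float:
    """Levenshtein distance over words, normalized by reference length."""
    r, h = ref.split(), hyp.split()
    # d[i][j] = edit distance between first i ref words and first j hyp words
    d = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        d[i][0] = i
    for j in range(len(h) + 1):
        d[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[len(r)][len(h)] / len(r)

ref = "I have had nausea for the past three days"
hyp = "I have not had nausea for the past three days"
print(word_error_rate(ref, hyp))  # one inserted word out of nine, ~0.11
```

The point being that aggregate accuracy metrics weight every word equally, while a single token ("not") can carry the entire clinical meaning of the sentence.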

gitowiec today at 5:59 PM

What a fucking absurdity. I used to go to the shrink a lot. First it was a man; he never took any notes and he remembered everything. Second was a woman; she took notes, but I never had any problem with that!

jimt1234 today at 4:29 PM

This situation is real. I've had the same doctor my entire adult life (~25 years). We've got a pretty informal relationship. I even saw her hammered at a bar one night, and had to give her a ride home because her friends were also drunk AF. Anyway, a few years ago, during an annual checkup, she asked how my family was doing and I made a joke about my brother drinking too much. A few weeks later I started receiving pamphlets in the mail about treating alcoholism, ads for rehab centers. I just brushed it off, didn't make any connection. Then, the next year, during my annual checkup, my doctor wasn't available, so I got a different doctor, someone I'd never talked to in my life. She immediately started asking me about my drinking. I fired back, asking WTF she was talking about. She said, "Oh, well your file says alcoholism runs in your family," and then started lecturing me about how getting over the shame of alcoholism is the first step to beating it. I don't even drink. No one in my family drinks other than my brother. He was drinking a lot at that time because his favorite NFL team (LA Rams) was doing really well, and he was celebrating a lot. And it was just a joke.

The next year, during my annual checkup, I gave my doctor a load of crap, telling her to record nothing I say unless I explicitly tell her to. She tried to defend the system, but she agreed. I'm still upset that my "file" still mentions alcoholism.

show 1 reply
varispeed today at 4:16 PM

I always agree if this is for academic purposes, if it helps with research etc. I can't see why I shouldn't. We are just meat that will expire one day.

k2xl today at 4:12 PM

I think the post conflates two issues:

1. AI-generated charting.
2. The existence of a reliable record of the visit.

I am skeptical of the first in some cases (i.e. bias), but strongly in favor of the second.

My father is 80 and has Parkinson’s. He routinely leaves appointments unsure of what the doctor said, what changed, or what he is supposed to do next. Even when I attend with him, we sometimes disagree afterward about what exactly was recommended.

This happens with pediatric appointments too. My wife and I occasionally remember instructions differently: medication timing, symptoms to watch for, when to call back, whether something was “normal” or needed follow-up.

That is a care quality problem, not just a convenience problem.

The risks are real: privacy, consent, retention, training use, liability, and automation bias. But those argue for strict controls, not for a blanket refusal. Make it opt-in, give the patient access, prohibit training without explicit consent, keep retention short, and require clear auditability.

I do not want opaque AI quietly rewriting the medical record. But I also do not think “everyone relies on memory after a stressful 12-minute appointment” is some gold standard we should preserve.

show 1 reply
impatient_bacon today at 4:18 PM

Oof yea, I just got surprised by this at a vet appointment for my dog, and it weirded me out. I just went along with it to get the visit over with, and I can see the benefit of having an accurate record of the visit, but we'll have to come to terms with the reality of this invasive surveillance as a society at some point, I imagine.

OutOfHere today at 4:32 PM

Why do we even need to consult doctors anymore? Just let the AI decide. Docs should be freed up for doing physical tests and interventions, or otherwise for providing more training data for the AI in cases where the AI isn't producing results or when a second look is urgently needed in an emergency situation.

walrus01 today at 4:09 PM

It's interesting how lots of service providers of all sorts will insist that you agree to their Terms of Service, Acceptable Use Policy, End User License Agreement (or whatever they want to call it) before engaging with you, but when the consumer insists on enforcing their own personal policy in the opposite direction, such as refusing consent to recording or to feeding your PII into some opaque AI system, suddenly it's a problem.

josefritzishere today at 4:01 PM

This is a hard no.

show 1 reply