Hacker News

magnetowasright · today at 4:25 AM · 0 replies · view on HN

The wilful ignorance and total apathy is appalling.

I've had similar experiences in Australia. I emailed one of my doctors' practices asking whether they use Heidi AI (or anything similar) and stating that I do not consent. They were already using it without my consent.

In the consultation, he tried to give me the spiel, including the 'it stays local' thing. The Heidi AI website publishes scripts for clinicians; he ran through them all.

Oh, and their documents for clinicians mention every two sentences that patient/client consent is not required at all. I wonder why they keep saying that? Hmm.

This doctor knows I am a developer. When I asked him to explain what he meant by 'local data', he said the servers were in Australia. I almost flipped the desk. Aside from the fact that hosting the data in Australia is mandatory (it's the law! they do not have a choice!), where the servers are is... kind of meaningless, especially when he (on behalf of Heidi AI) was trying to sell it as a security or privacy feature. Server location says nothing about who can access the data, how it's processed, or whether it's used for training. When I pointed that out, he just couldn't wrap his head around it. Of course he can't; he doesn't understand any of this.

AHPRA's "Meeting your professional obligations when using Artificial Intelligence in healthcare" guideline[0] (not any kind of enforceable requirement, unfortunately) has great stuff in it. It encourages using AI only with the informed consent of patients. But even if my doctor read it, agreed with it, and cared about getting consent, how on earth can he sufficiently inform patients when he has absolutely no idea about, well, anything?

He keeps pushing it and asking me whether I've changed my mind about allowing him to use it. No! He keeps asking me questions that only confirm he hasn't done even a perfunctory web search on why some people object to LLMs, especially in the context of PII and PHI.

I really do feel for clinicians, but these products are not the answer.

[0] https://www.ahpra.gov.au/Resources/Artificial-Intelligence-i...