logoalt Hacker News

jmyeetyesterday at 11:20 PM0 repliesview on HN

I'm reminded of the Air Canada customer service chatbot. It completely made up a refund policy (and there are still people on HN who insist LLMs don't hallucinate) and a court ruled the company had to honor it [1].

The only way to deal with this is to make the implentation not worth it by constantly bypassing it to speak to a human and/or making it cost money by getting it to give you things you're not otherwise entitled to.

I really wonder how these things will handle prompt injection and similar things. I have no confidence any of this is secure.

Wait until this comes to healthcare and it'll be chatbots handling appeals to prior authorization denials, wasting even more physician time.

[1]: https://www.wired.com/story/air-canada-chatbot-refund-policy...