yikes: https://news.ycombinator.com/item?id=46524382
[Teenager died of overdose 'after ChatGPT coached him on drug-taking']
The article is paywalled but appears to concern abuse of a cocktail of kratom, alcohol, and Xanax. I don't think that's really the same thing. Also, this feature isn't about making ChatGPT start answering medical questions anyhow, since people are already asking it those questions.
For comparison: an estimated 371,000 people die every year following a misdiagnosis, and 424,000 are permanently disabled. https://qualitysafety.bmj.com/content/33/2/109?rss=1
Admittedly this is pure vibes, but I'd bet that adding AI to the healthcare environment will, on balance, reduce that number, not increase it.