The article is paywalled, but it appears to concern abusing a cocktail of kratom, alcohol, and Xanax. I don't really think that's the same thing. Also, this feature isn't really about making ChatGPT start answering medical questions, since people are already using it for that anyway.