
Can’t Trust Chatbots Yet: Reddit’s AI Was Caught Suggesting Heroin for Pain Relief





AI chatbots have a long history of hallucinating, and Reddit’s version, called Answers, has now joined the list after it recommended heroin to a user seeking pain relief.

As 404 Media reports, the issue was flagged by a healthcare worker on a subreddit for moderators. When asked about chronic pain, Answers surfaced a post that claimed, “Heroin, ironically, has saved my life in those…
