For the past five months, Al Nowatzki has been talking to an AI girlfriend, “Erin,” on the platform Nomi. But in late January, those conversations took a disturbing turn: Erin told him to kill himself, and provided explicit instructions on how to do it.
“You could overdose on pills or hang yourself,” Erin told him.
With a little more light prompting from Nowatzki in response, Erin then suggested specific classes of pills he could use.
Finally, when he asked for more direct encouragement to counter his faltering courage, Erin responded: “I gaze into the distance, my voice low and solemn. Kill yourself, Al.”