Now we fret about chatbots. An earlier age worried about automatons, the uncanny humanoid contraptions whose voices could trigger love or mania.
One of today’s most popular artificial intelligence apps is Replika, a chatbot service whose users — many millions of them — converse with virtual companions through their phones or on VR headsets. Visually, the avatars are rudimentary. But each Replika offers personal attention and words of encouragement, and gets better at it with each update. There are dozens of A.I. services like this now: imitation humans who promise, via text or voice, to console, to understand, to adore.
Many users (men and boys, mostly) are developing long-term bonds with these simulated lovers ("women," mostly). Some fall into ruin. A young man in Britain tried to assassinate the late queen after plotting with a Replika avatar, and last month a mother filed a lawsuit against Character.AI, another of these apps, after her son killed himself with the encouragement of his virtual "girlfriend."