The brief flurry of AI-powered wearables like the Humane AI Pin and the Rabbit R1 doesn’t seem to have caught on the way their creators hoped, but one company seems to be banking on the idea that what we really want from an AI companion is non-stop drama and traumatic backstories. Friend, whose pendant hardware isn’t even out yet, has debuted a web platform at Friend.com that lets people chat with randomly assigned AI characters. The thing is, every character that I and several other people talked to was going through the worst day or week of their life.

Firings, muggings, and dark family secrets coming to light are just some of the opening gambits from the AI chatbots. These are events that would make for difficult conversations even with your best friend. A total stranger (that you’re pretending is human) shouldn’t kick off a possible friendship in the middle of intense trauma. That isn’t what CEO Avi Schiffmann highlights in the video announcing the website, of course.

Dramatic AI

(Image credit: Future)

You can see typical examples of the AI chatbots’ opening lines at the top of the page and above. Friend has pitched its hardware as a device that can hear what you’re doing and saying and comment in friendly text messages. I don’t think Craig is in any position to be encouraging after getting pistol-whipped, and Alice seems more preoccupied with her (again, fictional) issues than with anything going on in the real world.

Friend.com AI

(Image credit: Future)

These conversations are textbook examples of trauma-dumping: the unsolicited divulging of intense personal issues and events. Or they would be, if these were human beings and not AI characters. They don’t break the illusion easily, either; Craig curses at me for even suggesting it. Who wouldn’t want these people texting you out of the blue, as Schiffmann suggests?

Future Friends?

