Friends don’t let friends make friends with AI
Virtual girlfriends and AI friends represent a toxic trend that Elon Musk is happy to exploit.
Elon Musk’s AI company posted a weird job listing: “Fullstack Engineer – Waifus.” The pay is up to $440,000.
“Waifu” is a Japanese loanword derived from the English word “wife.” The term comes from anime and manga fandom, and refers to a fictional female character to whom a fan feels a strong emotional attachment.
xAI is seeking an engineer to build fake “people” designed to get real people to form emotional attachments to them.
This is a classic Silicon Valley Torment Nexus product. Find a malignant trend — dark social issues, dystopian sci-fi ideas — and build it in real life to make money.
The trouble with fake people
The problem of emotional attachments to software is most pronounced among the young. Some teenagers prefer fake AI friends to real ones.
Common Sense Media research examined teen use of Character.AI, Nomi, and Replika, which provide an AI friend to talk or text with. The survey of teens aged 13 to 17 found that 72 percent have used an AI companion at least once, with half as regular users and 13 percent using daily.
A third of teens seek AI “friends” for support, friendship, emotional help, and role-playing or romantic talk. About 18 percent trust them for advice. Some find these bots as satisfying to interact with as real friends, and a third use them for serious talk—sometimes sharing details they won’t tell anyone else.
Over one in three have had at least one uncomfortable or troubling experience.
Six percent say they now spend more time with bots than friends.
Enter Elon Musk
That sad trend struck history’s richest man as a great way to make even more money.
Elon Musk’s xAI launched a new feature in the Grok iOS app called Companions: digital characters that people can chat with. The flagship is Ani, a gothic, blonde anime girl whose age is ambiguous but who does not appear to be an adult woman. She reacts with facial expressions and speaks playful, flirty lines, and users can earn points to unlock a lingerie-clad NSFW mode. The National Center on Sexual Exploitation has called for Ani’s removal.
The second character is Bad Rudy, a red panda who interacts with sarcasm, insults, and even violent suggestions. Bad Rudy, by design, lacks the standard safety rails found in most chatbots and tells users to cause chaos. He even goads users to commit crimes like arson. Musk thinks it’s funny. I think it’s Musk encoded into software and cosplaying as a red panda.
Ani and Bad Rudy were initially available only to subscribers of the roughly $300-a-month SuperGrok Heavy plan on the Grok iOS app. Now the feature is in wider release for regular Grok users, too.
The social crisis
Since the Covid lockdowns and school closures, many have struggled to maintain social lives. For some, AI chatbots and “virtual girlfriend” apps have come to replace social interaction and real relationships. Introverts with social anxiety are offered an addictive “relationship” with nobody, instead of letting the craving for connection motivate them to go out, take risks, and meet people.
Pace University research found that emotional ties to bots can hinder people’s relationships with actual friends or partners.
In a 2024 Stanford study, users reported feeling lonely even as they chatted for hours with an AI partner.
Young men in particular have flocked to these apps. About 28% of men aged 18 to 34 have tried a virtual girlfriend app or chatbot at least once, and over half of those users chat with their AI partners daily, spending around $47 a month to unlock virtual gifts and premium features.
You read that right. The fake girlfriend app companies’ business model is to confuse users into believing that AI has feelings and emotions, then exploit that belief to manipulate customers into buying non-existent gifts for non-existent people.
Real relationships can be hard. But fake relationships are easy. Chatbots are always “on,” never moody, always flattering, and echo the user’s mood. The rewards of a chatbot — endless attention, no risk of being hurt — pull users away from friends and family, and make them less willing or able to handle real friendships or romance.
It’s bad. I resent Elon Musk for pursuing this business.
The Machine Society Interview: Dr. Claire Robertson
Dr. Claire Robertson studies online extremism and polarization as an assistant professor at Colby College in Maine. We talked about why social media can be so harmful, and what can be done about it. (This interview is available exclusively in this issue of Machine Society.)
More From Elgan Media, Inc.
When everything is vibing
Send in the clones
The one secret to using genAI to boost your brain
Where’s Mike? Mexico City!
(Why I’m always traveling.)
This reminded me of this story. Truly terrible. https://apnews.com/article/chatbot-ai-lawsuit-suicide-teen-artificial-intelligence-9d48adc572100822fdbc3c90d1456bd0