Does it matter if your friends are real?
When you’re talking to AI, or to a digital “person,” you’re talking to nobody. Is this a “social” act? What’s the difference?
Common Sense Media released a report this week about the impact of AI on children, teens, and families. The report, called “The Dawn of the AI Era,” says its survey found that a minority of teenagers use AI as a “friend.”
“Some teens also look for companionship when interacting with generative AI tools; 15% note that they have used gen AI to keep them company.”
Another sign of the times: developer Michael Sayman has published an iOS “social network” app in which you, the user, are the only human on the system. Called SocialAI, the app simulates human interaction on a Twitter-like “social network” where you get to “be the main character.”
You make posts, and bots comment, share, and engage, and you can have conversations with those bots.
Users can choose the kinds of engagement they want, selecting from a menu that includes trolls, haters, drama queens, etc. Sounds compelling.
SocialAI could have many uses: role-playing as an “influencer,” or using bots to draw out one’s own thoughts and feelings as a kind of journal. But on its face, it’s a social network for people who want to do social networking but don’t care whether they’re interacting with humans.
Another strange phenomenon I’ve commented on before is that “virtual influencers” have large fan bases who engage with them on social media. There are many, but one of the original digital “people” on social is Lil Miquela.
Other virtual influencers, many created with AI tools, are Lu do Magalu (@magazineluiza), Rozy (@rozy.gram), Ailynn (@ai_ailynn), Aitana (@fit_aitana), Imma (@imma.gram), Milla Sofia (@millasofiafin), Any Malu (@anymalu_real), Nobody Sausage (@nobodysausage), Noonoouri (@noonoouri), K/DA (@kda_music), Thalasya (@thalasya), Qai Qai (@realqaiqai), Kyra (@kyraonig), Shudu Gram (@shudu.gram), Bermuda (@bermudaisbae), Blawko (@blawko22), Liam Nikuro (@liam_nikuro), Zoe Dvir (@zoedvir), Seraphine Song (@seradotwav), Mar.ia (@soymar.ia), Teflon Sega (@teflonsega), CodeMiko (@codemiko), Yameii Online (@yameiionline), Zinn (@plusticboy), APOKI (@iamapoki), Chill Pill (@iamchillpill), and Ilona (@ilona).
Followers and fans actively cultivate parasocial relationships with these fake people. They like and comment as if there’s a person reading the comments. Presumably, these commenters know, but don’t care, that they’re engaging with someone who doesn’t exist.
This is just the tip of the iceberg of AI-based substitute humans.
Meta is rolling out an Instagram feature called AI Studio that enables influencers to offload the chore of interacting with fans to bots that imitate their style. Is engaging with a famous person’s bot really a connection with the person? Why do it?
A new industry of therapy apps is emerging where you pour your heart out to a chatbot, which gives you advice. Is interacting with nobody really the road to mental health?
Roboticists propose humanoid or pet-like robots for the lonely elderly. Is a machine really what people craving human connection need?
I actually think that AI therapy could be a very good thing. Doctors and therapists basically use decision lists to diagnose and treat ailments.
Imagine if your therapist could intervene using far richer data to give you the best treatment, instead of just billing you for endless hours of talk therapy.