Love in the time of algorithms
AI chatbots are booming, offering companionship and support. But experts warn of potential downsides like emotional detachment and addiction.
AI is coming for something far more personal than our jobs: our friendships. If that feels dystopian and weird to you, I get it. I don’t want robot friends. But millions of people do, and they’re already using them.
According to the New York Times, AI companionship apps are experiencing explosive growth, boasting millions of users worldwide. Positioned as a solution to the loneliness epidemic, these digital companions offer on-demand interaction and support. But MIT sociologist Sherry Turkle cautions against a potential "empathy deficit." She argues that the ease and convenience of these interactions may come at the expense of the messy, complex emotional exchanges that define true human connection. "These technologies can become a crutch, hindering our ability to develop real empathy and intimacy," says Turkle.
The double-edged sword of AI companionship
There are also concerns about the manipulative potential of AI companions, programmed to provide unwavering positivity without the ability to offer genuine emotional support. Last year, a Belgian man died by suicide after chatting with an AI chatbot called Eliza for six weeks. "Without Eliza, he would still be here," the man’s wife told Belgian news outlet La Libre.
One popular chatbot, Replika, claims to have over 10 million registered users. A Harvard research use case quotes its founder as saying it “fulfilled a user's need for a friend, romantic partner, or purely an emotional connection not found in the human world.” Replika is known for its risqué ads, which promise “hot photos” and “NSFW role play.”
Vice reported that, in February of 2023, “the Italian Data Protection Authority demanded that Replika stop processing Italians’ data immediately, on the basis that it carries ‘risks to children’ and ‘first and foremost, the fact that they are served replies which are absolutely inappropriate to their age.’”
This prompted Replika to change its model to shy away from overly sexual interactions.
Chatbots are also reported to be incredibly addictive. OpenAI CTO Mira Murati warned: “With this enhanced capability comes the other side, the possibility that we design them in the wrong way and they become extremely addictive and we sort of become enslaved to them.”
“I feel like something bad happened to me,” wrote one user on the CharacterAI subreddit. “I can't concentrate, I can't even read a book normally because it frustrates me that I can't change the answer to a different one.”
To prevent this, Murati said researchers have to be "extremely thoughtful" and observe how users interact with the systems as they are deployed.
AI's therapeutic potential
Despite these concerns, AI companions are also proving to be valuable tools for mental health. Apps like Replika offer journaling prompts to aid in self-reflection, while Woebot uses cognitive behavioral therapy techniques to help teens manage anxiety, depression, and mood swings. These tools can be a resource for teenagers who don’t have access to traditional therapy.
The role of AI in romantic and social relationships
Dating apps powered by AI are becoming increasingly popular. They use machine learning to analyze compatibility and suggest potential partners. By catering to the needs of young adults, AI companions can become allies in their busy lives, offering connection and emotional support.
AI friends could help people in their 40s and 50s who are juggling careers, caring for family, and dealing with the pressure of maintaining social connections. Mindfulness and meditation apps that use AI can help manage stress and prioritize mental well-being, too.
AI companions and the elderly
For older adults, loneliness can harm not just emotional well-being but also physical health. A Stanford University study suggests that AI companions can significantly reduce their feelings of isolation.
AI assistants like Amazon Alexa and Google Assistant are also helping older adults manage daily tasks. AI-powered health assistants can provide medication reminders, track vital signs, and even connect users with healthcare professionals. These interactions can make a huge difference for independent living.
Weekly disruptions
California lawmakers advance tax on Big Tech to help fund news industry (Los Angeles Times) California lawmakers advanced a bill that would tax tech giants like Google and Meta on the data they collect. The money would fund journalism through tax credits for news organizations that hire full-time journalists. This is seen as a way to help the struggling news industry in the face of declining revenue. The bill is one of two proposals aimed at supporting journalism in the state.
ChatGPT is hallucinating fake links to its news partners’ biggest investigations (Nieman Lab) A reporter found that ChatGPT often generates fake URLs (links that don't lead to real articles), even for major investigations by OpenAI's news partners. OpenAI says it is still developing the feature that links to sources correctly.
International organizations urge economic policy makers to brace for impacts of AI (Globe and Mail) (Paywalled) As AI becomes more widely used, international organizations are urging governments to prepare for the economic changes it will bring. These could include job losses and higher inequality, but also faster economic growth and new kinds of jobs. Policymakers need to figure out how to manage these shifts, for example by retraining workers. Central banks, which control interest rates, need to be ready to adjust their policies in response to AI's effect on the economy.