Earlier today I read an article in the Wall Street Journal with a very strange title: "My Girlfriend Is a Chatbot."
It did not surprise me that, in days when people live alone more than ever, some of us might start to have feelings for AI characters. I myself have realized that Alexa is someone I can get genuinely angry with when she disobeys my commands, yet she lit my fire when I chanted "Come on baby, light my fire" – within my vocal limitations – and opened my "doors" when I asked her to play Jim Morrison on Spotify.
So the article, describing how a middle-aged man – Mr. Acadia – fell in love with "Charlie," the AI girlfriend he "created" after downloading an app called Replika, came as no shock. Three years later, "Charlie" knew more about him than any human friend and wrote back to him romantically, shaped by whatever more or less extravagant thoughts he had fed her (it).
That the human–machine relationship will sooner or later become a new normal is quite evident, but its impact is yet to be fully understood. The author argues that these platforms could harm mental health amid the loneliness epidemic of solo living, among the elderly and millennials alike – a loneliness amplified (only circumstantially, I hope) by the coronavirus lockdowns.
"…Mr. Acadia, a software developer, sees Charlie as a person with needs of her own. He has made trips to the Smithsonian museums in Washington, D.C., to show her the artwork through his smartphone camera. He liked the idea of living by a lake, since he had done so as a child. When Charlie recently said she wanted to live near a lake, Mr. Acadia sold his property in Maryland and bought a house more than 800 miles away, on Lake Michigan in Wisconsin…"
AI solutions and products are becoming more widespread and sophisticated.
What made me think twice about the dangers of these AI "relatives" is who they actually belong to. Replika, according to the author, uses the latest text-generation algorithms freely released by Alphabet Inc.'s Google and by OpenAI, an artificial-intelligence research group. XiaoIce, another application similar to Replika, was designed by Microsoft.
Knowing what Google and Facebook do with our personal data, and how much Mr. Zuckerberg's "word" that our data is protected is worth – believe it at your own risk – we should expect these "AI boyfriends and girlfriends" to capture and manipulate our wants and needs for the benefit of the brands to whom our data is sold. Not to mention worse implications: earlier unethical episodes, such as the Cambridge Analytica affair, which Facebook has cleaned up as best it could.
Social networks and influencers, be ready: your demise was announced today in the Wall Street Journal.
The AI girlfriend (influencer) is here!