Opinion
Artificial relationships: Will AI replace love in the near future?
“While Generative AI models simply mimic humans and are not sentient in any way, it is becoming increasingly common for people to develop emotional connections with AI models such as ChatGPT,” writes Adir Ron, CTO at Microsoft for Startups
In the past few months, we’ve seen an unparalleled rise of Generative AI models, many of them trained to create new and creative content such as music, images, and videos. Since most of the training data behind these new models was drawn from our documented history, this is the first time AI models have been trained on emotionally expressive art at such a large scale, meaning they can convey various emotions such as happiness, sadness, and anger.
One of the ways that Generative AI models can perceive emotions is through deep learning algorithms that analyze large amounts of data from various sources to identify patterns and recognize emotions. This raises a very interesting question: if humans were able to express emotions in the data they produced, data that now sits at the heart of these models, can AI models trained on human emotions be considered emotional as well?
To start addressing this question, we probably need to go down the rabbit hole of whether human emotions can be captured and articulated as well as other forms of data, and whether trying to “mathematicize” concepts like love or loss will always fall short of how humans experience them. On the one hand, Generative AI models were designed to interact with humans, and therefore have the ability to recognize human emotions, respond accordingly, and even mimic them. They can generate content designed to evoke specific emotions, such as happy images or sad songs, and on a larger scale they can use deep learning algorithms and vast amounts of data to learn the various emotional expressions of different cultures.
At the same time, it’s important to note that while the content they produce may be very hard to distinguish from human art, AI models are based purely on algorithms and data, and have no true consciousness or self-awareness. They lack many of the engines that drove art throughout our history, such as memories and real-life experiences, as they exist only in the digital realm. Moreover, the technology to fully capture and reproduce the biological and neurological mechanisms necessary for experiencing emotions does not yet exist, so these models cannot truly feel or understand them.
One example we can explore is DALL-E 2, the ingenious AI image generation technology from OpenAI. While DALL-E 2 is not capable of truly experiencing or feeling love, it has been designed to recognize and replicate patterns that are commonly associated with emotions in human art. This means that the images generated by DALL-E 2 that express love are not a reflection of the AI model’s own emotions, but rather a representation of love as it has been perceived and translated by human creators. These images may be visually similar to human art that expresses love, making it very hard for humans to tell which art was created from an artist’s own emotions and experiences and which was created from statistics and data science.
OpenAI’s ChatGPT, which recently set the record as the fastest-growing web platform in history, is another interesting test case. At its core, ChatGPT is a natural language processing AI model designed to generate human-like text responses in real time. In movies like Her, we’ve seen humans fall in love with ChatGPT-like models, but it’s clear to anyone who has used the service that it’s quite far from the story portrayed in the film. Love is a complex emotion based on many factors, including physical attraction, shared memories, and personal experiences built on common interests. These are all things that ChatGPT models lack, despite their advanced language skills and ability to mimic human conversation.
Having said that, even if we are still quite far from Hollywood’s fantasies, there are already some interesting angles to explore. For example, there are already tools that let people use ChatGPT in popular dating applications like Tinder. If the entire virtual interaction with a potential love interest were based on ChatGPT answers (and DALL-E 2 pictures, if we want to take this to the extreme), and the other party is interested enough to move forward to a physical meeting – who actually created that initial infatuation?
Even if we disregard this example as a case of “fraud,” since there was a user behind the scenes who used ChatGPT, there could be cases where someone opts in to this interaction fully aware that ChatGPT is an AI-based model. If the person is isolated and has few social connections in real life, they may begin to see the AI model as a source of comfort and companionship. In other cases, if a person seeking advice or guidance from the AI model receives helpful and supportive responses, they may begin to rely on it as a trusted source of support. These emotional connections could foster a sense of attachment and dependency that in many cases could be described as a form of close friendship, or possibly even love. As we already acknowledged, we can’t “mathematicize” love, which here also means we can’t really set a boundary or fully understand how these AI-based interactions will make humans feel.
At the end of the day, it’s quite clear that as this technology continues to advance, it will continue to rely heavily on human-generated content for its training, encapsulating the many ways that emotions like love were documented throughout history. While Generative AI models simply mimic humans and are not sentient in any way, it is becoming increasingly common for people to develop emotional connections with AI models such as ChatGPT. From feelings of dependency when those models provide companionship and joy, to feelings of disappointment and disillusionment when they fail to provide the support and comfort that the person needs – this is just the beginning. The relationship between humans and AI models is just taking its first steps, and while it is an exciting journey, like any other technology we should also be aware of where that path could take us in the future.
Adir Ron is the CTO of Microsoft for Startups