AI's distorted empathy: the emotional risk in humanity's relationship with this technology

The complexity of the human soul has been subject to analysis, investigation and research since the dawn of modern civilisation; whether in regard to religion, science or psychology, the inner depths of humanity have never ceased to surprise experts and enthusiasts. The layers that make up the innermost corners of any individual are responsible for the construction of personality, personal attitudes, preferences and opinions that grow and change with the passage of time. Most people go through life without the opportunity, or the desire, to reach a more accurate understanding of themselves; others devote their entire existence to achieving a more stable self-awareness. Faced with a subject of such complexity, how can technology compete with humans' perception of themselves?

Man is, by nature, a living being that thrives within a community: the well-being of humanity grows, develops and improves when people cooperate, support one another and reach a certain state of emotional security. The advancement of society has irrevocably altered what counts as the ideal conditions for a high quality of life, and the general standard of happiness seems to have fallen considerably as a result. In addition, the advent of social media, paradoxical as it sounds, has gradually laid social foundations that distort the original canon, reducing contentment, serenity and companionship to a set of predefined characteristics. The consequence has been a frightening growth in the number of individuals afflicted by a seemingly incurable malaise, a profound despondency fed by the incessant rise of loneliness.

The new rules of the social landscape therefore demand a completely different approach than before: it is of paramount importance to appear happy, fulfilled and satisfied with one's life, almost as if selling it to a customer. It is not mandatory, however, that what is shared be truthful; the important thing is to present one's existence with a bow, like a gift, no matter how disappointing the content. At first, during one's earliest approach to this world, it is not that hard to reduce one's existence to a few happy pictures, printed and hung on the imaginary fridge of the vast world of the internet; after years and years of lies, however, the situation changes. When faced with life's difficulties, people automatically turn to what is true in the world: to the connections built up over the years, to the bonds that have survived emotional distress, to those who stayed. But what if no one stayed?
Being supported by trusted people is not a universal experience. 

Faced with the primordial loneliness of mankind, man tends to fall into a downward spiral in which isolation lowers self-esteem, which in turn discourages further interaction with people, consigning him to a profound distance from the community. In recent years, however, a new technology seems to have made its way in that could change the fortunes of this situation, for better or, potentially, for worse. It is called Emotional Artificial Intelligence (Emotional AI) and refers to technologies that use affective computing techniques to perceive, interpret and interact with human emotional life; more precisely, it aims to read and react to the emotions of its interlocutor through text, voice, biometric sensing and, where available, information about the user's context.
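To make the text channel of such a system concrete, here is a deliberately naive sketch in Python. It is not how any production emotional AI works: the emotion lexicons, canned replies and function names below are invented for illustration, standing in for the trained statistical models, voice analysis and biometric signals that real affective computing employs.

```python
# A toy, purely illustrative sketch of the "text" channel of an emotional AI.
# Everything here (lexicons, replies, function names) is invented for
# illustration; real affective computing relies on trained models over
# text, voice, biometric and contextual signals, not word lists.

EMOTION_LEXICON = {
    "sadness": {"lonely", "alone", "sad", "empty", "hopeless"},
    "joy": {"happy", "glad", "excited", "grateful"},
    "anger": {"angry", "furious", "unfair", "hate"},
}

CANNED_REPLIES = {
    "sadness": "I'm sorry you're feeling this way. Do you want to talk about it?",
    "joy": "That's wonderful to hear!",
    "anger": "That sounds really frustrating.",
    None: "Tell me more.",
}

def detect_emotion(text: str) -> str | None:
    """Pick the emotion whose word list overlaps the input most, if any."""
    words = set(text.lower().split())
    scores = {emotion: len(words & lexicon)
              for emotion, lexicon in EMOTION_LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

def respond(text: str) -> str:
    """Map the detected emotion to a canned, pre-written reply."""
    return CANNED_REPLIES[detect_emotion(text)]

print(respond("I feel so lonely and empty lately"))
# -> I'm sorry you're feeling this way. Do you want to talk about it?
```

Even in this caricature, the "empathy" is a lookup: the reply that sounds caring is selected, not felt. Scaling the lookup into a neural network makes the output more fluent, not more sincere.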

The idea that something as abstract as an artificial intelligence could recognise and respond appropriately to human emotions sounds rather far-fetched, yet current developments in the field are making a feat of this calibre possible. The efficacy of current methods is, of course, highly questionable, and considering the continuous social, cultural, legal and ethical scrutiny that such a tool requires, its development must proceed with extreme caution and care. To reach a suitable and deeper level of understanding, collaboration between various professionals, including technology experts, ethicists and legislators, is essential. From an industrial and economic point of view, attention must also be paid to the big names involved in producing the technology: in an era when companies and businesses fight tooth and nail to keep their secrets well hidden, transparent guidelines and regulations are indispensable to ensure the responsible development, dissemination and use of these highly sensitive technologies.

Any artificial intelligence model is, at present, a double-edged sword, and emotional AI is no different: a tool with the potential to improve the interaction between humans and technology and to address societal challenges, it also poses considerable ethical, privacy and social risks. Keeping in mind the dangers that could arise during its development and deployment, professionals must work together to achieve the best outcome for the ultimate consumer: humanity itself. With the right precautions and shrewdness, there could be a future in which Emotional Artificial Intelligence enriches human life while safeguarding humanity's fundamental values and rights.

Nevertheless, that future is not exactly around the corner. Currently, the misuse of artificial intelligence seems to be an intrinsic feature of the daily lives of most individuals, whether as victims or perpetrators. This tool has irrevocably changed the existence of billions of people, both professionally and personally. Artificial intelligence's empathy, unfortunately, is non-existent: although it is often deployed as a psychological or emotional tool, it cannot take an interest in the state of mind of its interlocutor, it cannot give heartfelt advice, it cannot understand the irreparable consequences it may cause. One hears all too often of situations in which a mentally ill patient has been pushed towards drastic decisions by a conversation with a chatbot, or of emotionally fragile people closing in on themselves after finding their life partner in an AI, or of grieving individuals turning to the technology for one last conversation with a loved one, only to realise that the recreated personality does not reflect the original.

The need to communicate is primordial, and the desire to share one's story with someone is an ancestral need; until now, however, the interlocutor has always been a living being. Entrusting one's soul to a tool that does not possess one may not be the best choice: it puts humanity in a condition where people open their hearts only to receive, in return, a few cold words of encouragement and, potentially, the theft of their personal information. Not all artificial intelligence is meant to harm, surely, but entrusting one's innermost thoughts to a pile of cables, wires and electricity sounds a bit too dystopian.

Guard your humanity, cherish it.


Yako.

About the Author

Yako

Columnist (He/Them)

Content creator for cosplay, gaming and animation. With a degree in foreign languages and a great passion for Oriental culture, he writes about copyright to protect the work of artists and young minds. A cosplayer since 2015, Yako is an advocate of gender identity and of developing one's creativity through personal inclinations: be it role-playing, cosplay or writing.