Emotional AI Cannot Replace Empathy

In 2023, artificial intelligence that can detect and interact with human emotions will become one of the dominant applications of machine learning. For example, Hume AI, founded by former Google researcher Alan Cowen, develops tools to measure emotions from verbal, facial, and vocal expressions. The Swedish company Smart Eyes recently acquired Affectiva, an MIT Media Lab spin-off that developed the SoundNet neural network, an algorithm that classifies emotions such as anger from sound samples in less than 1.2 seconds. Even the video platform Zoom will soon offer Zoom IQ, a feature that gives users real-time sentiment and engagement analysis during virtual meetings.

In 2023, tech companies will launch advanced chatbots that closely mimic human emotions to create more empathetic connections with users in banking, education, and healthcare. Microsoft’s chatbot Xiaoice has already been successful in China, with the average user reportedly chatting with it more than 60 times a month. It also passed the Turing test: users failed to recognize it as a bot for 10 minutes. Analysis from the consultancy Juniper Research projects that chatbot interactions in healthcare will reach 2.8 billion annually in 2023, up almost 167 percent from 2018. This will free up healthcare staff time and potentially save approximately $3.7 billion for healthcare systems worldwide.

In 2023, emotional artificial intelligence will become widespread in schools. In Hong Kong, some secondary schools are already using an artificial intelligence program developed by Find Solutions AI that measures micro-muscle movements in students’ faces and identifies a range of negative and positive emotions. Teachers use this system to monitor students’ motivation and focus, as well as emotional changes, enabling them to intervene early if a student loses interest.

The problem is that most emotional AI is based on flawed science. Emotional AI algorithms reduce facial and tonal expressions to a single emotion, regardless of the social and cultural context of the person and the situation, even when trained on large and diverse datasets. For example, while algorithms can recognize and report that a person is crying, it is not always possible to accurately deduce the reason and meaning behind the tears. Similarly, a frowning face doesn’t necessarily indicate an angry person, but that’s what an algorithm would likely conclude. Why? We all adjust our emotional displays according to social and cultural norms, so our expressions are not always a true reflection of our inner states. Often people do “emotion work” to hide their true feelings, and how they express emotion is likely a learned response rather than a spontaneous expression. For example, women often modify their emotional expressions more than men, especially for emotions with negative values attributed to them, such as anger, because they are expected to do so.
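The critique above can be made concrete with a toy sketch: a context-blind mapping from observed expression to emotion label, mirroring how emotion-recognition systems collapse an expression into a single inner state. All names here are hypothetical, invented purely for illustration; real systems use trained models, but the structural problem, one expression in, one emotion out, with no slot for context, is the same.

```python
# Toy illustration (hypothetical names): a context-blind lookup from an
# observed expression to a single emotion label.
EXPRESSION_TO_EMOTION = {
    "frown": "anger",
    "tears": "sadness",
    "smile": "happiness",
}

def naive_emotion(expression: str) -> str:
    """Return the one emotion label assigned to an observed expression."""
    return EXPRESSION_TO_EMOTION.get(expression, "neutral")

# The function takes no context argument, so it cannot distinguish tears
# of grief from tears of joy, or a frown of anger from a frown of
# concentration -- every "frown" is reported as "anger".
print(naive_emotion("frown"))  # anger
print(naive_emotion("tears"))  # sadness
```

The point of the sketch is the missing parameter: whatever the model's sophistication, if context never enters the mapping, the output cannot reflect it.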

Therefore, AI technologies that make assumptions about emotional states will likely exacerbate gender and racial inequalities in our society. For example, a 2019 UNESCO report showed the harmful effects of gendering AI technologies, with “feminine” voice-assistant systems designed around stereotypes of emotional passivity and servility.

Facial recognition AI can also perpetuate racial disparities. An analysis of 400 NBA games with two popular emotion-recognition programs, Face++ and Microsoft’s Face API, showed that Black players were on average assigned more negative emotions, even when they were smiling. These results reaffirm other research showing that Black men must project more positive emotions in the workplace because they are stereotyped as aggressive and threatening.

Emotional AI technologies will become more ubiquitous in 2023, but if left unquestioned and unexamined, they will reinforce systemic racial and gender biases, replicate and amplify inequalities around the world, and further disadvantage those who are already marginalized.
