
Detail of a high-rise in Montreal. Photo by Phil Deforges, https://unsplash.com/photos/ow1mML1sOi0

Misreading our Emotions: The Troubles with Emotion Recognition Technology

The business of emotion recognition is a lucrative one—but is it based on an unsound premise?

Unravelling the nature of emotional expression

Is the expression of emotion universal across humans? In 1967, American psychologist Paul Ekman set out to answer this question. He brought a set of flashcards to the isolated Fore people of Papua New Guinea to test whether they recognized a set of core expressions, including anger, sadness, fear, and joy. Ekman asked a select group of Fore people to make up a story about what was happening to the person pictured on each flashcard. Due to the language barrier, he likened the process to “pulling teeth”, but Ekman did get his stories, and they seemed to correspond to his understanding of emotional expression. When shown a photo of a person expressing sadness, for example, the Fore described the subject as having just discovered that his son had died. In another, a story about a dangerous wild pig was attributed to a person expressing fear.

Ekman’s studies were seen as ground-breaking, and he remains one of the most cited psychologists of the twentieth century. In 2019, however, neuroscientist Lisa Feldman Barrett conducted a review of the scientific literature on the subject and concluded that there was no reliable evidence that an individual’s emotional state can be accurately predicted from facial expressions. It is on this shaky ground that the emotion recognition technology (ERT) industry has been emerging. The technology is developing rapidly and is increasingly becoming part of the core infrastructure of many platforms. Despite its growing presence in our lives, it is not often at the forefront of public discourse concerning artificial intelligence (AI). So, what exactly is ERT?


The business of recognizing emotions

Just as it sounds, ERT is a type of AI that attempts to identify and interpret human emotions. Using biometric data (data relating to body measurements and calculations based on human characteristics), ERT assigns emotional states to humans based on facial expressions, bodily cues, and eye movement. The idea of an automated form of emotion recognition is as intriguing as it is lucrative. Advocates point to its wide range of potential applications: in healthcare, to prioritize care; in business, to develop marketing techniques; in law enforcement, to detect and reduce crime; in employment, to monitor employees; and in education, to cater to student needs. Critics, however, argue that its negative potential outcomes are likely to far outweigh the good. While the possible issues that have been raised are numerous, most fall within three broad categories of concern: privacy, accuracy, and control.
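To make the mechanics concrete, here is a minimal, deliberately naive Python sketch of the inference step such systems perform: measured facial features go in, an emotion label and a confidence score come out. Every feature name and prototype value below is a hypothetical stand-in, not any vendor’s actual model; the point is that the “emotion” is simply whichever hand-picked template the measurements sit closest to.

```python
from dataclasses import dataclass
import math

# Hypothetical facial measurements, normalized to [0, 1]. Real ERT systems
# extract many such features from video frames; these names are invented.
@dataclass
class FaceFeatures:
    brow_raise: float
    brow_furrow: float
    lip_corner_pull: float  # i.e., smiling
    eye_openness: float

# Toy "prototype" expressions in the spirit of Ekman's core categories.
# The values are illustrative guesses, not empirically derived templates.
PROTOTYPES = {
    "joy":     FaceFeatures(0.3, 0.1, 0.9, 0.5),
    "sadness": FaceFeatures(0.2, 0.6, 0.1, 0.3),
    "fear":    FaceFeatures(0.9, 0.4, 0.2, 0.9),
    "anger":   FaceFeatures(0.1, 0.9, 0.1, 0.6),
}

def classify(face: FaceFeatures) -> tuple[str, float]:
    """Assign the nearest prototype emotion plus a naive 'confidence'."""
    def dist(a: FaceFeatures, b: FaceFeatures) -> float:
        return math.dist(
            (a.brow_raise, a.brow_furrow, a.lip_corner_pull, a.eye_openness),
            (b.brow_raise, b.brow_furrow, b.lip_corner_pull, b.eye_openness),
        )
    label, d = min(((name, dist(face, proto)) for name, proto in PROTOTYPES.items()),
                   key=lambda pair: pair[1])
    return label, 1.0 / (1.0 + d)  # distance dressed up as confidence

# A polite smile and a joyful one produce similar measurements:
print(classify(FaceFeatures(0.3, 0.2, 0.8, 0.4)))  # ('joy', ...)
```

Nothing in this computation distinguishes genuine joy from a polite or sarcastic smile; the classifier reports whichever template is nearest, with unearned precision.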


Emotion recognition technology: how bad could it be?

In 2021, UN High Commissioner for Human Rights Michelle Bachelet warned of the risks that emotion recognition systems pose to privacy and associated rights. As is the case with any AI system, ERT can facilitate and exacerbate privacy intrusions by incentivizing the large-scale collection of personal data. ERT is especially controversial because it relies on biometric information, which is considered sensitive personal information under a number of existing privacy regimes, including the EU’s General Data Protection Regulation, Quebec’s Law 25, and Canada’s Personal Information Protection and Electronic Documents Act. The purpose of ERT is uniquely intrusive in that it works by making inferences about an individual’s internal state. Activists therefore argue that ERT represents a threat to fundamental rights by undermining freedom of thought. In some contexts, ERT algorithms could be used to infer specific categories of sensitive personal data, including information about a person’s mental or physical health. This information could then be used to deny an individual essential services or support, such as healthcare or employment. Consider, for example, an insurance company using ERT to detect signs of certain neurological disorders in order to deny coverage for such pre-existing conditions. These inherent risks must be weighed in any evaluation of whether the technology’s potential benefits are worth it.


The inaccuracy of ERT

Due to the subjective nature of emotional expression, ERT is particularly prone to producing inaccurate results. The expression of emotions may vary based on culture, context, and the individual. According to one study, East Asians and Western Caucasians differ in terms of which features they associate with an angry or happy face. In another study, Japanese subjects were far more likely than their American counterparts to react differently when an authority figure was in the room. It is therefore difficult to universally equate a given expression with a specific emotional state. ERT is also susceptible to bias. Data scientists have pointed out that algorithms are based on past practices and are often used to automate the status quo, including its biases. Consider, for example, an algorithm used in a male-dominated field that had been conditioned to analyze male features. If ERT were used to recruit new employees, it might systematically misread, and thereby screen out, candidates who do not resemble its training data. Such bias is not mere speculation. A recent study of three commercial services (Amazon Rekognition, Face++ and Microsoft) found stark racial disparities: each was more likely to assign negative emotions to Black subjects. The subjectivity of expression and the bias of algorithms call ERT’s accuracy into question. If it cannot promise correct results, is the technology still valuable?
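The kind of disparity audit described above is straightforward to express in code. The sketch below uses entirely fabricated records (the group names, labels, and counts are invented for illustration) to show the core measurement: for each demographic group, compute the share of ERT predictions that fall into a negative emotion category, then compare the rates.

```python
# A toy fairness audit in the spirit of the study described above.
# All group names, emotion labels, and records are fabricated.
predictions = [
    ("group_a", "anger"), ("group_a", "joy"), ("group_a", "contempt"),
    ("group_a", "joy"), ("group_b", "joy"), ("group_b", "joy"),
    ("group_b", "sadness"), ("group_b", "joy"),
]

NEGATIVE = {"anger", "contempt", "fear", "sadness"}

def negative_rate(records, group):
    """Share of a group's ERT predictions that are negative emotions."""
    labels = [emotion for g, emotion in records if g == group]
    return sum(label in NEGATIVE for label in labels) / len(labels)

for group in sorted({g for g, _ in predictions}):
    print(f"{group}: {negative_rate(predictions, group):.0%} negative")
# group_a: 50% negative
# group_b: 25% negative
```

A persistent gap between such rates, measured on comparable images, is precisely the sort of evidence that calls a system’s neutrality into question.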


ERT as a form of public surveillance

For public safety and law enforcement, ERT offers governments an intriguing tool for detecting potential threats. The mass public surveillance that ERT would require to detect threats effectively, however, risks reorienting our society towards one of constant monitoring. This risk is heightened in times of war, uncertainty, or political unrest, when the public is more willing to cede freedom in exchange for security. Citing public safety concerns, governments could implement ERT to screen people in public spaces. Government adoption of this technology is still in its early stages and remains controversial. iBorderCtrl, for example, is a smart border control project that uses ERT to produce a “risk score” for travellers seeking entry; it has been tested in Hungary, Latvia, and Greece. It is not difficult to imagine a world where ERT is commonplace across airports and other areas of heightened security. Governments could also use this technology, however, to discourage unwanted public behaviour normally protected in a democratic society, such as public protest. In countries where government surveillance is the norm, many of the fears over ERT have already been realized. In 2020, a Dutch company was reported to have sold emotion recognition technology to Chinese security agencies and universities. This technology has purportedly been used to tighten control over the already heavily monitored Uyghur people of Xinjiang. The risk that ERT poses to personal freedoms and democratic values remains a glaring issue, and one that calls for a government response.


The uncertain future of emotion recognition technology

Despite the myriad concerns raised by experts, current data and privacy legislation does not sufficiently address the risks posed by ERT. From the unsettled science it is premised on to its use of biometric data, ERT is uniquely dangerous. Some critics of the technology, including a number of human rights organizations, have called for its complete prohibition. Short of a complete ban, the results of ERT should at least be treated with a healthy degree of skepticism unless and until a scientific consensus emerges that facial expressions can reliably reveal emotion.

But even in a world where technology could accurately reveal our emotions, would such an outcome be desirable? As we have seen, the concerns surrounding ERT are not limited to those that could easily be fixed with ‘better technology’ or ‘more data’. Any technology that purports to detect our internal state is bound to infringe on our privacy to some degree. As with any fast-moving emerging technology, however, our grasp of its potential consequences is limited by our current understanding. It is therefore of paramount importance that experts, legislators, and the public engage in open, transparent, and candid dialogue about the future of this technology. A clear and comprehensive regulatory framework for ERT is necessary to safeguard democratic freedoms for future generations. Some advocates have suggested that governments subject emotion recognition to the kind of rigorous approval process required for developing medicines, to guard against unproven applications. Such stringent oversight of the technology’s development, combined with strict regulation of the use of biometric data, could help mitigate ERT’s most harmful potential results.
