AI isn’t great at decoding human emotions. So why are regulators targeting the tech?

In addition to proposing the theory of evolution, Darwin studied the expressions and emotions of people and animals. In his writing he debated just how scientific, universal, and predictable emotions really are, and he made sketches of characters with exaggerated expressions, which the library had on display.

The subject rang a bell for me. 

Lately, as everyone has been up in arms about ChatGPT, artificial general intelligence, and the prospect of robots taking people’s jobs, I’ve noticed that regulators have been ramping up warnings against AI-driven emotion recognition.

Emotion recognition, in this far-from-Darwin context, is the attempt to identify a person’s feelings or state of mind using AI analysis of video, facial images, or audio recordings. 

The idea isn’t super complicated: the AI model may see an open mouth, squinted eyes, contracted cheeks, and a thrown-back head, for instance, register the combination as a laugh, and conclude that the subject is happy. 
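
To make that leap concrete, here is a purely illustrative, toy sketch in Python of the kind of inference such systems attempt. Every feature name, threshold, and rule below is hypothetical and invented for this example; it is not any vendor's actual method, just a minimal picture of mapping facial-feature scores to an emotion label.

```python
# Toy illustration only: a made-up rule-based mapping from hypothetical
# facial-feature scores (0.0 to 1.0) to an emotion label.
def classify_emotion(features: dict[str, float]) -> str:
    """Guess an emotion label from facial-feature scores."""
    mouth_open = features.get("mouth_open", 0.0)
    eyes_squinted = features.get("eyes_squinted", 0.0)
    cheeks_raised = features.get("cheeks_raised", 0.0)
    brows_lowered = features.get("brows_lowered", 0.0)

    # An open mouth, squinted eyes, and raised cheeks get read as a laugh,
    # and the laugh gets read as happiness -- two leaps the article questions.
    if mouth_open > 0.6 and eyes_squinted > 0.5 and cheeks_raised > 0.5:
        return "happy"
    if brows_lowered > 0.6 and mouth_open < 0.3:
        return "angry"
    return "neutral"

print(classify_emotion({"mouth_open": 0.8, "eyes_squinted": 0.7, "cheeks_raised": 0.6}))
# Prints "happy" -- even if the person is actually wincing, crying, or faking it.
```

Real systems replace the hand-written rules with trained models, but the underlying assumption, that a visible expression reliably reveals an inner state, is the same.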

But in practice, this is incredibly complex—and, some argue, a dangerous and invasive example of the sort of pseudoscience that artificial intelligence often produces. 

Certain privacy and human rights advocates, such as European Digital Rights and Access Now, are calling for a blanket ban on emotion recognition. And while the version of the EU AI Act that was approved by the European Parliament in June isn’t a total ban, it bars the use of emotion recognition in policing, border management, workplaces, and schools. 

Meanwhile, some US legislators have called out this particular field, and it appears to be a likely contender in any eventual AI regulation. Senator Ron Wyden, one of the lawmakers leading the regulatory push, recently praised the EU for tackling it and warned, “Your facial expressions, eye movements, tone of voice, and the way you walk are terrible ways to judge who you are or what you’ll do in the future. Yet millions and millions of dollars are being funneled into developing emotion-detection AI based on bunk science.”
