It is a technology that has been frowned upon by ethicists: now, researchers hope to unmask the reality of emotion recognition systems in order to revive public debate.
Technology designed to identify human emotions using machine learning algorithms is a huge industry, with claims that it could prove useful in a myriad of situations, from road safety to market research. But critics say the technology not only raises privacy concerns but is also inaccurate and racially biased.
A team of researchers has created a website – emojify.info – where the public can try out emotion recognition systems using their own computer cameras. One game focuses on pulling faces to trick technology, while another explores how such systems can struggle to read facial expressions in context.
Their hope, the researchers say, is to raise awareness of the technology and promote conversations about its use.
“It’s a form of facial recognition, but it goes a step further because rather than just identifying people, it claims to read our emotions, our inner feelings, from our faces,” said Dr Alexa Hagerty, project lead and researcher at the University of Cambridge Leverhulme Centre for the Future of Intelligence and the Centre for the Study of Existential Risk.
Facial recognition technology, often used to identify people, has come under intense scrutiny in recent years. Last year, the Equality and Human Rights Commission said its use for mass screening should be stopped, saying it could increase police discrimination and undermine freedom of expression.
But Hagerty said many people were unaware of how common emotion recognition systems were, noting that they were used in situations ranging from hiring and customer analysis to airport security, and even in education, to see whether students are engaged or doing their homework.
Such technology, she said, was in use all over the world, from Europe to the United States and China. Taigusys, a company specializing in emotion recognition systems and headquartered in Shenzhen, says its systems are deployed in settings ranging from nursing homes to prisons, while according to reports earlier this year, the Indian city of Lucknow plans to use the technology to spot distress in women as a result of harassment – a move that has drawn criticism, including from digital rights organizations.
Although Hagerty said emotion recognition technology could have potential benefits, these must be weighed against concerns about accuracy and racial bias, as well as whether the technology was even the right tool for a particular job.
“We need to have a much broader conversation and public deliberation about these technologies,” she said.
The new project allows users to try out emotion recognition technology. The site notes that “no personal data is collected and all images are stored on your device.” In a game, users are asked to make a series of faces to simulate emotions and see if the system is fooled.
“The claim of the people who are developing this technology is that it reads emotion,” Hagerty said. But, she added, in reality the system would read facial movements and then combine them with the assumption that those movements are related to emotions – for example, a smile means someone is happy.
“There’s a lot of really solid science that says it’s too simple; it doesn’t quite work that way,” Hagerty said, adding that even ordinary human experience showed it was possible to fake a smile. “That is what this game was: to show you didn’t change your inner state of feeling six times in quick succession, you just changed how you looked [on your] face,” she said.
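To make that critique concrete, here is a minimal, purely illustrative sketch (in Python) of the kind of fixed movement-to-emotion mapping Hagerty describes. The movement names, labels and rules are invented for the example and are not taken from any real emotion recognition product.

# A hypothetical sketch of the naive assumption described above:
# detect a facial movement, then assume it corresponds to an inner emotion.
# All movement names, labels and rules below are invented for illustration.

NAIVE_RULES = {
    "lip_corner_puller": "happy",     # a smile is simply read as happiness
    "brow_lowerer": "angry",
    "inner_brow_raiser": "sad",
    "jaw_drop": "surprised",
}

def infer_emotion(detected_movements):
    """Return the first emotion whose rule matches a detected facial movement."""
    for movement in detected_movements:
        if movement in NAIVE_RULES:
            return NAIVE_RULES[movement]
    return "neutral"

# A posed smile produces "happy" even if the person feels nothing of the sort,
# which is the gap the emojify.info game is designed to expose.
print(infer_emotion(["lip_corner_puller"]))  # -> happy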
Some emotion recognition researchers say they are aware of these limitations. But Hagerty said the hope is that the new project, which is funded by Nesta (National Endowment for Science, Technology and the Arts), will raise awareness of the technology and foster discussion around its use.
“I think we are starting to realize that we are not really ‘users’ of technology, we are citizens in a world being deeply shaped by technology, so we need to have the same kind of democratic, citizen-based input on these technologies as we have on other important matters in societies,” she said.
Vidushi Marda, senior program manager at the human rights organization Article 19, said it was crucial to press pause on the growing market for emotion recognition systems.
“The use of emotion recognition technologies is of deep concern because not only are these systems based on discriminatory and discredited science, their use is also fundamentally incompatible with human rights,” she said. “An important learning from the trajectory of facial recognition systems around the world has been to question the validity and need for technologies early and often – and projects that emphasize the limits and dangers of emotion recognition are an important step in that direction.”