Privacy watchdog urges companies drop emotional analysis AI software

Companies should think twice before deploying AI-powered emotional analysis systems prone to systemic biases and other snafus, the UK’s Information Commissioner’s Office (ICO) warned this week.

Organizations face investigation if they press on and use this sub-par technology that puts people at risk, the watchdog added.

Machine-learning algorithms purporting to predict a person’s moods and reactions use computer vision to track gazes and facial movements, and audio processing to gauge inflection and overall sentiment. As one might imagine, the results are not necessarily accurate or fair, and there may be privacy problems in handling the data used for training and inference. A rough sketch of the sort of pipeline being described follows below.
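To make the idea concrete, here is a minimal, illustrative sketch of such a pipeline, not any vendor’s actual product: a frame is scanned for a face with OpenCV, and the crop is handed to a stand-in classify_emotion function. The classifier here is a hypothetical placeholder; real systems run proprietary models (and typically fuse audio features such as pitch and inflection), and it is precisely the accuracy and fairness of those models that the ICO is questioning.

```python
# Illustrative sketch only: face detection via OpenCV plus a stand-in
# "emotion" classifier. classify_emotion is a hypothetical placeholder,
# not a real library call.
import cv2


def classify_emotion(face_crop):
    """Hypothetical placeholder -- a real system would run a trained model here."""
    return {"label": "neutral", "confidence": 0.0}


def analyse_frame(frame, face_cascade):
    # Detect faces in the frame, then score each crop with the placeholder model.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [classify_emotion(frame[y:y + h, x:x + w]) for (x, y, w, h) in faces]


if __name__ == "__main__":
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    cap = cv2.VideoCapture(0)  # default webcam
    ok, frame = cap.read()
    if ok:
        print(analyse_frame(frame, cascade))
    cap.release()
```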

Indeed, ICO deputy commissioner Stephen Bonner said the technology isn’t foolproof, leading to systemic bias, inaccuracy, or discrimination against specific social groups. “Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever,” he said in a statement. 

“While there are opportunities present, the risks are currently greater. At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination,” his statement continued.

Bonner noted emotional analysis software is often paired with biometric systems, collecting and storing a wider range of personal information than facial images alone. Software providers are often not transparent about how this personal data is processed or saved.

“The only sustainable biometric deployments will be those that are fully functional, accountable and backed by science,” he said. “As it stands, we are yet to see any emotion AI technology develop in a way that satisfies data protection requirements, and have more general questions about proportionality, fairness and transparency in this area.”

Which seems a polite way of saying: you should only be using emotion-reading AI models that work as the ICO expects, and those models don’t exist, so you should just stop what you’re doing.

Britain’s data privacy watchdog is expected to publish a report in spring next year detailing how companies should handle biometric data, including facial recognition, fingerprint identification, and voice recognition.

“The ICO will continue to scrutinize the market, identifying stakeholders who are seeking to create or deploy these technologies, and explaining the importance of enhanced data privacy and compliance, whilst encouraging trust and confidence in how these systems work,” Bonner added.

Experts have sounded the alarm over sentiment and emotion analysis for years. Feelings are subjective and it’s difficult to interpret them accurately. Humans often can’t do this effectively for themselves or others, let alone machines. ®
