Emotion-sensing technology – friend or foe?


Developments in emotion-sensing technology mean it could soon appear in a workplace near you. But do the benefits outweigh the privacy concerns?

Wouldn’t it be great if your manager could know when you were a bit stressed-out, without you having to put it into words? And on the other side of the coin, surely management would benefit from knowing when an employee is too distracted to perform a task, and likely to commit a serious blunder?

Emotion-sensing technology (EST) could be the next big thing in employee performance, says a recent report by MIT Sloan. By analysing employee body language – such as eye movement, facial expressions and skin response – EST can let employers know when workers are overly stressed or otherwise emotionally distraught.

The technology has actually been around for a while – in 2009, the ‘Rationalizer’ device was developed by Philips and Dutch bank ABN AMRO to prevent rash and hazardous decisions by investment bankers. A wearable, the Rationalizer works similarly to a lie detector, measuring emotional arousal through skin response. The device not only helps users rethink their decisions by making them aware that they are operating in an emotional state, but also feeds employers data about the triggers and environmental factors behind risky decisions.

New AI developments

The emotion-detection and recognition market is growing in prominence and is expected to see significant growth over the next few years. By 2022, it is predicted to be worth an estimated US$38 billion worldwide.

Emotion detection is also increasingly powered by artificial intelligence. According to Annette Zimmermann, research vice president at Gartner, “By 2022, your personal device will know more about your emotional state than your own family.”

Her colleague Roberta Cozza, research director at Gartner, says of the developments in this space: “Prototypes and commercial products already exist and adding emotional context by analysing data points from facial expressions, voice intonation and behavioural patterns will significantly enhance the user experience.

“Beyond smartphones and connected home devices, wearables and connected vehicles will collect, analyse and process users’ emotional data via computer vision, audio or sensors capturing behavioural data to adapt or respond to a user’s wants and needs.”

Smartwatches and fitness trackers already on the market can detect stress levels by monitoring changes in heart rate and electrodermal activity – micro-changes in the skin’s sweat response of which we ourselves are unaware.

Chatbots are also becoming more adept at recognising human emotion. Take the Chinese-developed ‘Emotibot’, for example. The app can detect patterns in user emotion from multiple sources, such as text activity, tone of voice and facial expressions. Technology like this can understand our needs before we do, and can instruct us how to adapt our environment accordingly.

Algorithms and hardware

While not exactly an emotion detector, Barcelona-based company Telefónica I+D has developed an algorithm that detects boredom and distraction by analysing smartphone activity. It looks at a user’s battery consumption, whether they have logged into Instagram and how often they check their email. According to the MIT report, it can identify user boredom 80 per cent of the time. Essentially, it knows whether employees are just killing time or actually working.
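Telefónica I+D’s actual model has not been published, but the general idea – combining phone-usage signals into a single boredom score – can be sketched. The feature names, thresholds and weights below are entirely hypothetical and chosen only to illustrate the approach:

```python
# Illustrative sketch only: Telefónica I+D's real algorithm is not public.
# All features, caps and weights here are hypothetical.

def boredom_score(battery_drain_pct_per_hr, instagram_opens, email_checks_per_hr):
    """Combine phone-usage signals into a rough 0-1 'boredom' score."""
    # Each signal is capped at a plausible maximum and normalised to 0-1,
    # then the three are blended with weights summing to 1.
    return (0.4 * min(instagram_opens / 10, 1.0)
            + 0.35 * min(email_checks_per_hr / 12, 1.0)
            + 0.25 * min(battery_drain_pct_per_hr / 20, 1.0))

# A phone showing heavy idle scrolling scores higher than one in steady use.
print(boredom_score(15, 9, 10) > boredom_score(3, 1, 1))  # True
```

A production system would learn such weights from labelled usage data rather than hand-pick them, but the shape of the computation is the same.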

There are also emotion detectors built into hardware. MIT’s Affective Computing Lab, for example, was able to identify user stress levels by how hard people pressed on a keyboard and how they held their mouse. And webcams can record stress-related increases in heart rate by analysing the light reflected off a user’s face.
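The webcam technique (sometimes called remote photoplethysmography) works because the pulse causes tiny periodic changes in the brightness of skin pixels. A minimal sketch of the signal-processing step, assuming the per-frame mean brightness has already been extracted from the video:

```python
import math

def estimate_heart_rate(brightness, fps):
    """Estimate beats per minute from a per-frame mean-brightness signal."""
    n = len(brightness)
    mean = sum(brightness) / n
    signal = [b - mean for b in brightness]  # remove the constant (DC) offset
    best_freq, best_power = 0.0, -1.0
    # Scan candidate frequencies in the plausible pulse range (45-180 bpm)
    # and keep the one with the strongest periodic component.
    for k in range(46):
        freq = 0.75 + 0.05 * k  # 0.75 Hz .. 3.0 Hz in 0.05 Hz steps
        re = sum(s * math.cos(2 * math.pi * freq * i / fps)
                 for i, s in enumerate(signal))
        im = sum(s * math.sin(2 * math.pi * freq * i / fps)
                 for i, s in enumerate(signal))
        power = re * re + im * im
        if power > best_power:
            best_freq, best_power = freq, power
    return best_freq * 60.0  # convert Hz to beats per minute

# Synthetic 10-second clip at 30 fps with a 1.2 Hz (72 bpm) pulse component.
fps = 30
clip = [100 + 0.5 * math.sin(2 * math.pi * 1.2 * i / fps) for i in range(300)]
print(round(estimate_heart_rate(clip, fps)))  # 72
```

Real systems face noisy lighting and head movement, so they filter and average far more aggressively, but the core step is the same: find the dominant frequency in the brightness signal within the human pulse range.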

A privacy concern?

While these tools can work to prevent stress and burnout or detect conflict, there are also significant privacy concerns if they are widely adopted by organisations. In a previous article on HRMonline about wearable technology in the workplace, ethics professor Petrina Coventry says that with such monitoring, “Employers need to think carefully about the way in which they collect, use and disclose information they obtain from employees.”

Surely knowing that you are being monitored to such a degree could make you feel like you’re in a fishbowl and heighten anxiety about performance – in this writer’s humble opinion, anyway.

In a different article on HRMonline, Michael Byrnes from Clayton Utz pondered whether employers could compel workers to be microchipped. Where workplace health and safety is at stake, there is some legal precedent for reasonable intrusions into privacy – such as drug and alcohol testing for those operating heavy machinery. The same principles could apply to certain emotion-sensing technology.

“It’s worth remembering that just because a technological development enables a particular process or innovation to be adopted for employees doesn’t mean it automatically can be,” wrote Byrnes.

Happy monitoring… get it? As in have fun monitoring the happy…


Examine HR’s ethical role in an organisation and learn to use a case-based approach to deal with ethical dilemmas, in the AHRI short course ‘Workplace ethics’.


3 Comments
Tamara S
6 years ago

This could be a positive step in identifying symptoms of depression with the aim to provide assistance and early intervention. However, as with the introduction of any AI-driven technology, there are a number of ethical considerations. In addition, just knowing you’re being monitored will alter normal emotional responses – what are the consequences, and how do we troubleshoot, when the sensor gets its wires crossed?

Bradley
6 years ago

How can we trust the holders of this information to use it ethically when financial gain is involved? Facebook, medical records, facial recognition in public… I’m not sure I feel safe having the world know my private life. This data is never fully secure.

trackback
Boundless Leadership: When other people ruin your day - Thrive Global
1 year ago

[…] technology. There was a remarkable article in HR Magazine that outlined the innovations in emotional sensors being developed for artificial intelligence. The […]
