Spotlight
December 1, 2024

Data on our minds: affective computing at work

Digital profiling using algorithms and datasets is becoming increasingly pervasive in every aspect of life. Alongside our experiences as consumers, citizens and patients, data collection and technological monitoring are increasingly focused on people’s behaviour and activities in the workplace. Where these systems are used to infer workers’ emotions and other inner states, this is known as 'affective computing'.

The introduction of Algorithmic Affect Management (AAM) technologies - designed to take these inferences about our emotions and behaviours at work and feed them into algorithmic management systems - is opening a new frontier of surveillance and privacy concerns. This has significant regulatory implications across domains, from how these technologies are defined to the need for specific, updated protections in both hard and soft law.

Our latest report examines how these technologies - while offering potential productivity gains and occupational health and safety benefits - also introduce significant challenges to job quality and wellbeing. Invasive surveillance practices often undermine the promised benefits, highlighting the need for careful, deliberate management to ensure ethical and responsible implementation.

Key findings include:

  • A shift in managerial decision-making, where automated inferences, classifications, and measurements related to workers' identities and potential behaviours are becoming increasingly significant.
  • While AAM has the potential to improve work design and conditions, the report highlights that certain applications are linked to exploitative practices and increased technostress, driving new forms of harm to health and labour rights.
  • The current patchwork of protections is insufficient to safeguard workers' privacy and their physiological and mental integrity when AAM technologies are deployed.
  • There is an urgent need for regulations that address the multifaceted challenges posed by AAM, including the potential for direct or indirect discrimination based on protected characteristics and the risks associated with ‘neurosurveillance’.

This report calls for robust regulations and proactive design to mitigate risks of bias, privacy infringements, and mental strain. By addressing these challenges, we can ensure technology adoption aligns with the principles of good work, safeguarding not only productivity but the wellbeing of all workers.

-

This report is a fantastic contribution to debates about the adoption of technology in the workplace, highlighting the near-term risks and opportunities in the adoption of affective computing. It highlights the crucial importance of worker participation to ensure that these technologies have positive impacts on worker health, wellbeing and agency. - Jeni Tennison, Founder and Executive Director of Connected by Data

An incisive, state of the art examination of the use of “emotion AI” at work. It’s a remarkably revealing document, offering extraordinary insights into the danger of workplace surveillance powering affective manipulation. - Frank Pasquale, Professor of Law at Cornell Tech

With clear, on-point examples, this timely report exposes the dangers such technologies pose to workers' health and safety, shedding light on the profound risks of misuse. A must-read for anyone concerned about the intersection of AI, labour, and OSH. - Dr Aude Cefaliello, Senior Researcher - Health, Safety & Working Conditions, ETUI (European Trade Union Institute)

This groundbreaking report represents a crucial advancement in our understanding of affective algorithmic management and its profound implications for the future of work. By shedding light on the nuanced risks, ethical dilemmas, and opportunities presented by these technologies, it establishes an urgent call for systematic action and paves the way for informed policymaking, ensuring that innovation is harnessed responsibly to protect workers' rights and promote dignity, fairness, and well-being in the modern workplace. - Professor Peter Bloom, Professor of Management at the University of Essex

Read the report

Authors

Professor Phoebe Moore and Dr Gwendolin Barnard

Publication type

Report

Programme

Prioritising people
