
Postdoctoral researcher Dr Garfield Benjamin comments on news reports of a wristband that would allow an employer to track your emotional state…

18th January 2021
Research

The Moodbeam bracelet probably ticks all the boxes for a company wanting a quick fix for its employees’ mental health. But without context the data will be meaningless, and if it is used anyway it is likely to be harmful.

There are so many ways this technology could cause discrimination. Someone who is non-neuronormative, someone on hormone replacement therapy, or someone from a culture with different emotional norms is likely to find themselves flagged as a ‘problem’ by an overzealous app and a paranoid HR department.

What are the implications of answering the device honestly? Would pressing ‘sad’ too often call someone’s ability to do their work into question? Far more important than how the data is gathered is what a company would actually do about it. And as feminist scholar Sara Ahmed writes, “when we describe a problem we become a problem”.

These tools are likely to exacerbate existing workplace prejudices and inequalities. The issues and pressures people are facing simply cannot be measured in this way.

Even the business-like framework of a dashboard showing who is and isn’t ‘coping’ is not fit for purpose, in the pandemic or at any other time. The idea that clicking ‘happy’ or ‘sad’ is a real measure of how someone is feeling is very dangerous. More generally, mental health is not something that should be metricised, least of all by our employers.

Mireille Hildebrandt describes parts of our lives that form an “incomputable self”, and the importance of keeping these aspects of ourselves out of data-hungry platforms. It speaks to the problem of converting people into data, of constantly quantifying and categorising us in ways that ignore the lived reality of what we are experiencing. Companies keeping track of their employees’ mood in this way raises a whole host of questions. How will they interpret the data? What will they decide using the data? How long will this data be kept? Who else has access to it? How might it be used in the future? The potential for abuse and misuse is huge.

At a time when some employers are already inserting themselves into people’s homes, blurring work/life boundaries in invasive ways - through productivity tracking and other working-from-home surveillance tools - tech that pretends to solve a personal or social issue in this way risks setting dangerous precedents. We don’t need more surveillance in our homes adding the pressure of feeling constantly watched on top of a pandemic.

It is particularly disturbing that the idea came from tracking children, a situation where an open, supportive and ongoing conversation is needed. As I have written elsewhere, privacy is part of a set of cultural norms around how we interact with technology. It includes considerations of our identity and community, and sharing - even being vulnerable - is an important part of this, provided it happens within an appropriate and caring context.

There are many ways technology can enable support for mental health and related issues, but a happy/sad bracelet sounds more like something from a dystopian story than one of these important tools.

Dr Garfield Benjamin's research spans cultural theory and creative media practice, focusing on the relation between humans and (digital) technology.