Independence and cooperation are essential for regulating online content, says Dr Garfield Benjamin, Postdoctoral Researcher in Sociology

2nd July 2021
Health, psychology and sociology

The UK’s proposed Online Harms Bill, currently set to undergo scrutiny by MPs before it goes to a vote in Parliament, marks some important steps towards tackling the problems caused by online platforms. But it also raises a lot of questions, particularly around who gets to decide what content is or isn’t allowed online.

The Online Harms Bill is, in principle, a welcome move to address social media platforms’ failures to tackle issues like hate speech and abuse on their services. But in many ways it misses the mark. Many of the “easier” problems (at least legally speaking) - overtly racist abuse, for example - could already be handled under existing anti-discrimination and hate speech laws.

Part of the problem with online platforms, however, is that they haven’t yet been properly regulated through these existing mechanisms. So there has been a push from many different groups, including researchers and civil society organisations, for more targeted rules on what kinds of content Facebook, Twitter, TikTok and the like should block on their services. This is particularly important in political contexts.

The difficulties come with content that is “harmful but not illegal”, as deciding what to remove will inevitably involve judgements that run up against issues of free speech. But free speech does not mean freedom from responsibility for what you say, and exercising it irresponsibly could well lead to being banned from an online platform (just as you would be asked to leave a restaurant if you started abusing the staff or other diners).

Any social setting has a set of norms - the sometimes unspoken rules and expectations that guide our behaviour and set what is or isn’t allowed in that context. So many of the free speech concerns around regulating online platforms are overblown, although free speech has consistently been a topic the platforms themselves (especially Facebook) have invoked to stave off regulation.

Some of the problems with the UK proposals can instead be seen as part of the government’s so-called culture war, echoing moves in education, health and elsewhere to control the narratives of social and political life. DCMS (the Department for Digital, Culture, Media and Sport), and the minister in charge of it, are pushing for more direct control over what is and isn’t allowed.

Similar issues are arising with DCMS’s interventions in other areas, like the BBC or the Proms, and many of these areas fall under Ofcom, the regulator that has traditionally focused on TV and radio but is being given the new role of handling digital content under the Online Harms Bill.

The issue then becomes not a matter of individuals being unable to exercise their right to free speech, but of whole sections of society being amplified or silenced at the whim of the government. There are legitimate fears that the Online Harms Bill could end up protecting racists, sexists, homophobes and transphobes while removing the voice of the groups targeted by such abuse.

If the government bans anyone from criticising it, or from discussing important issues and systemic injustices like racism and colonial history, then existing problems will only get worse. Online platforms already misrepresent certain groups in harmful ways - image searches that show only white male doctors, for example, or adverts that target better job and financial offers only at already privileged people. A heavy-handed minister leaning on regulators could make this much worse.

So the problems come down - at least partially - to independence, scope and whether the regulator will have any teeth of its own. In my Digital Society report last year, I mapped the messy state of UK regulation of online platforms. Responsibility is spread across the ICO for privacy, Ofcom for online content, the CMA for monopolies, the ASA for adverts and the Children’s Commissioner for issues affecting children and young people, as well as multiple government departments, each with a vested interest in particular aspects of digital society.

I argue that we need not only stronger powers for regulators but also greater cooperation between them. We can’t separate what is being said (Ofcom and Online Harms) from who is saying it (which is linked to who we are, and so to personal data and the ICO). Nor can we separate user content from advertising or political content.

We need a forum for the different regulators to come together and tackle issues embedded in the design of social media and other online platforms, rather than just sticking a bandage over the huge inequalities these platforms create.

Perhaps most importantly, regulators need independence from the government, so that they can act as a voice for representing marginalised and targeted groups. This is essential if Online Harms regulation is going to prevent harmful speech online without silencing those who rely on social media to find communities and support.

Dr Garfield Benjamin is a Postdoctoral Researcher in sociology. You can read more about his areas of interest and research on his academic profile: https://www.solent.ac.uk/staff-profiles/garfield-benjamin