University of Glasgow research identifies 80 online harms as part of big tech regulation challenge
Researchers at the University of Glasgow have identified 80 ‘distinct’ online harms facing policy-makers as they grapple with new ways to regulate tech platforms.
The analysis was conducted by academics exploring the range of issues raised in eight government and parliamentary reports and inquiries into the tech industry over the past 18 months.
Their work has fed into a new high-level research report produced as part of the Arts & Humanities Research Council’s Creative Industries Policy and Evidence Centre.
The reports, published between September 2018 and February 2020, dealt with issues such as online harms, cybercrime, and the regulation of social media platforms.
The University of Glasgow researchers’ analysis revealed that some 80 distinct online harms have been discussed. The regulatory landscape is ‘cluttered’, with no fewer than nine different UK agencies holding separate responsibilities.
Professor Martin Kretschmer, professor of intellectual property law and director of the UK Copyright and Creative Economy Centre (CREATe) at the University of Glasgow, one of the report authors, said: “Fake news, cyber-attacks, predatory acquisitions. Dangerous things are happening on online platforms. But how do we, as a society, make decisions about undesirable activities and content?
“UK policy-makers hope to delegate tough choices to the platforms themselves, focusing on codes of practice and codes of conduct supervised by regulatory agencies, such as the Office of Communications (Ofcom), for a new ‘online duty of care’, and competition regulator Competition and Markets Authority (CMA), through a ‘digital markets unit’. Our new empirical study shows how this approach emerged, and how it compares in a global setting.”
The researchers also express concern that the evolving regulatory structure appears to be blind to the effects of platforms on cultural production and diversity. Understanding the role of ranking and recommendation algorithms as cultural gatekeepers still needs to be integrated into the platform policy agenda.
Professor Philip Schlesinger, professor in cultural theory (Centre for Cultural Policy Research and CREATe), said: “Platform regulation is now at the heart of how democracies conduct themselves. It’s also increasingly at the core of how we manage rules for our digital social connectedness.
“So, understanding how regulation works and the forces that are shaping it has become crucial for everyone. It’s important that the present rush to regulate doesn’t ignore the huge contribution of creative industries to the cultural economy. And since the UK’s multinational diversity has been thrown increasingly into relief by Brexit and the pandemic, how regulatory policy plays out will be of special interest to the devolved administrations.”
The researchers found that, as platform regulation has become a major government concern, two clear priority areas have emerged: online harms and competition. These broad categories include everything from mental health to intellectual property rights.
The picture the researchers paint is of a complex and potentially confusing policy environment. There is limited consensus on what regulation should look like, or even on how to define key terms such as ‘platform’.
The research also found that US multinationals – Google and Facebook in particular – have captured regulators’ attention, and they dominate references in the official literature.