In February I met with Jackie Doyle-Price, Minister for Mental Health, Inequalities and Suicide Prevention. Much of our conversation focussed on how to tackle the widespread availability of material related to self-harm and suicide on social media platforms. Concern has also been expressed from all quarters about the availability of illegal material online, as well as the challenges of addressing harmful behaviour such as stalking and trolling.
This week the Department for Digital, Culture, Media and Sport published a white paper on online harms, which aims to improve internet safety by requiring digital platforms such as social media companies and search engines to take responsibility for potentially harmful content. The proposals are presented as one way of tackling some of those challenges.
The plans cover a range of issues defined in law – terrorist content and child sexual abuse, for example – as well as other serious issues such as material about self-harm and suicide, and potentially damaging behaviour with a less clear legal definition, such as trolling and spreading disinformation.
The white paper also suggests there should be a code of practice for social networks and internet companies, defined and enforced by a new (or existing) regulator, which would have the power to issue fines and critical notices, and could also potentially hold company executives liable and even block harmful websites.
Many have welcomed these proposals, including those concerned about the impact of harmful content on vulnerable users such as children and young people, and others who feel "the digital sphere has become too lawless", as Ian Murray, Executive Director of the Society of Editors, set out in his response to the announcement, whilst warning about the potentially negative impact on press freedoms.
Concerns have been raised by freedom of speech campaigners that the proposals could result in censorship and state regulation of speech. Matthew Lesh, head of research at the free market think tank the Adam Smith Institute, described them as "a historic attack on freedom of speech and the free press", and former Secretary of State John Whittingdale described them as a "draconian censorship regime".
The new proposals are certainly interesting from an IPSO perspective.
In an interview given to Radio 4's Today programme, Secretary of State Jeremy Wright made clear that the target of any legislation was user-generated content, adding "we're not interested here in journalists' content. We are not interested in what journalists write. And of course, what journalists say on the radio … what they write in newspapers, what they say on television, is controlled in other ways". His comments were followed up in a letter of 10 April to Ian Murray, making clear that the Government has no intention of duplicating the work of IPSO in regulating the press.
That said, the 12-week consultation launched alongside the white paper poses some pertinent questions, including how any new regulator might operate and what the scope of its powers might be – for example, requiring social media companies to publish annual transparency reports on the amount of harmful content on their platforms and what they are doing to address it. Also of interest were proposals relating to media literacy, suggesting a strategy to teach people to recognise deceptive and malicious behaviours online.
Certainly, the proposals put forward in the white paper are ambitious, but they also highlight the difficult balance to be struck between providing real protection from online harm and protecting freedom of speech.