Head of Standards Charlotte Urwin on Ofcom’s first annual Online Nation report.
According to a new study published by Ofcom and the Information Commissioner's Office, adults now spend almost 50 days of their year online.
Whilst some of that time will be spent shopping online, completing questionnaires to work out which type of cheese you are, or (my personal favourite) watching videos of dogs trying to carry big sticks, it will also be spent looking at content on sites owned by newspapers and magazines, much of it likely regulated by IPSO.
The issue of how readers engage with online content, and how they distinguish regulated content from the unregulated mass, is a challenge that regulators, government departments and researchers are all trying to grapple with. Do readers distinguish between different kinds of content on social media? What mechanisms can be adopted to help readers to distinguish that content and, in particular, to help them to identify content which is deliberately misleading?
Most people in the surveyed group reported potentially harmful online experiences. The potential online harms most commonly encountered by adults were unsolicited emails (34% experienced in the past year), fake news (25%) and scams or fraud (22%). Whilst we have all received scam emails in the past (I have lost count of how many times I have won lotteries I never entered), I was interested to see that a quarter of those surveyed reported encountering fake news.
At IPSO, we want all citizens to have appropriate levels of media literacy to make informed decisions about what sorts of news they would like to access. They should be able to identify and avoid harmful fake news, and know how to identify curated and edited content displaying high-quality journalism. We would also expect consumers to have awareness of the methods available to seek redress from the regulated press when journalists do get things wrong.
We believe that independent regulation has a key role to play in this, by holding publishers accountable to an external set of standards and helping consumers to easily identify edited, curated, professionally produced products.
The research team also found that support for greater online regulation appears to have increased in a range of areas. Most adults favour tighter rules for social media sites (70% in 2019, up from 52% in 2018); video-sharing sites (64% v. 46%); and instant-messaging services (61% v. 40%).
This is very timely, given the ongoing consultation on the white paper on online harms produced by the Department for Digital, Culture, Media and Sport, which aims to improve internet safety by requiring digital platforms such as social media companies and search engines to take responsibility for potentially harmful content.
Amongst other recommendations, the white paper suggests there should be a code of practice for social networks and internet companies, defined and enforced by a new (or existing) regulator, which would have the power to issue critical notices and could also potentially fine company executives and even block harmful websites.
The consultation on online harms launched alongside the white paper poses some pertinent questions, including how any new regulator might operate, what the scope of its powers might be, and how it might work in a world of increasing globalisation. Also of interest were proposals relating to media literacy, suggesting a strategy to teach people to recognise deceptive and malicious behaviours online.
Ofcom research shows that both the opportunities and challenges posed by the internet will be with us for a long time to come. The question is how organisations and individuals can come together to address those challenges and harness those opportunities.