Chief Executive Matt Tee on the Digital, Culture, Media and Sport Select Committee’s report on fake news and disinformation.
About this time last year I was due to give oral evidence to the Digital, Culture, Media and Sport Select Committee’s Inquiry into Fake News. At the time, the Inquiry had been very much focused on news-type content, and I was due to give evidence alongside Ofcom and IMPRESS. My appearance was postponed, however, so that the Committee could hear from a single mysterious witness.
This turned out to be Chris Wylie, the Cambridge Analytica whistleblower.
The discovery of the Facebook data issues changed the nature of the Inquiry and extended it to the point that the report has only just been published. I was never called back.
While the report talks about ‘platforms’, it concentrates almost exclusively on Facebook and the companies that use its data to target and deliver ‘advertising’ to selected audiences. Much of the Inquiry focuses on using Facebook for political messages and advertising and the effect that may have had in international elections and especially the Brexit referendum.
The Committee makes numerous recommendations, many of which seem sensible. They seek to resolve the vexed issue of ‘platform’ v ‘publisher’ by suggesting a new category for tech companies. They recommend that tech companies should have a legal liability to act against harmful or illegal content, underpinned by a code of ethics, like the Broadcasting Code. The powers of existing regulators should be extended to enable them to require information from tech companies, including the workings of their algorithms.
In the world of politics, the Committee recommends that all political ‘advertising’ should be publicly available, including the source of the funding. They also call for all political parties to work with the ICO, Cabinet Office and Electoral Commission to improve transparency over commonly held data sets.
There are, however, several significant issues that may impinge on the implementation of the Committee’s recommendations. Two of them are particularly relevant to IPSO.
The first of these is globalisation. The tech companies are global and their platforms can be accessed almost worldwide. Assuming that a UK regulator is given the power to enforce Facebook’s liability to act against harmful or illegal content, how is the regulator going to define which content is within its jurisdiction? And is the regulator (as opposed to the courts) going to decide when content is illegal?
A second issue is political advertising. Much of this content may not even look like an advert. It may not mention a party or cause, but it will have been paid for with the intention of changing people’s views in a way that will advance a party or cause. Political advertising, even outside the tech environment, is already a regulatory anomaly – IPSO doesn’t regulate paid-for content, however much it looks like editorial (and expects it to be clearly identified as such). The Advertising Standards Authority, which does regulate paid-for content in some instances, doesn’t regulate political advertising, for reasons that its Chief Executive, Guy Parker, explains here.