DCMS committee calls for regulation of online platforms
A compulsory code of ethics, overseen by an independent regulator, should be brought in to monitor online platforms like Facebook and bring an end to self-regulation, according to a UK parliamentary report.
The Digital, Culture, Media and Sport select committee’s damning final report on ‘Disinformation and “fake news”’, published today after an 18-month inquiry, recommended regulation to clamp down on harmful or illegal online content.
It called for the new regulator to be funded by a levy on tech companies operating in the UK and said that it should have statutory powers to launch legal action and levy “hefty fines” against companies in breach of the proposed ethics code.
“We need a radical shift in the balance of power between the platforms and the people. The age of inadequate self-regulation must come to an end,” said Damian Collins MP, chair of the DCMS committee.
“The rights of the citizen need to be established in statute, by requiring the tech companies to adhere to a code of conduct written into law by parliament and overseen by an independent regulator.”
With a forthcoming government white paper due to be published on online harms, the committee said it hopes to see “firm recommendations” for a regulatory system for online content that is “as effective as that for offline content”.
“Social media companies cannot hide behind the claim of being merely a ‘platform’ and maintain that they have no responsibility themselves in regulating the content of their sites,” according to the report.
The DCMS committee said that a new category of tech company should be created that tightens online companies’ liabilities, without necessarily casting them as either ‘platform’ or ‘publisher’.
It said the code of ethics should set down in writing what is acceptable on social media and be similar to the Broadcasting Code issued by Ofcom, which is based on the guidelines established in section 319 of the 2003 Communications Act.
The inquiry into disinformation and ‘fake news’ was announced in September 2017, and much of the evidence it scrutinised focused on the business practices of Facebook – before, during and after the Cambridge Analytica scandal.
The final report was highly critical of Facebook, with Collins claiming that CEO Mark Zuckerberg, who chose not to appear before the committee, “continually fails to show the levels of leadership and personal responsibility that should be expected from someone who sits at the top of one of the world’s biggest companies.”
Other recommendations included in the report were for the UK Government to reform current electoral communications laws and rules on overseas involvement in UK elections, claiming that current regulations are “not fit for purpose”.
The final DCMS report follows an interim report that was published in July 2018. Overall, the committee held 23 oral evidence sessions, received more than 170 written submissions and heard evidence from 73 individuals.
Representatives from eight countries were also invited to join the DCMS Committee to create an ‘International Grand Committee’, in a bid to establish a united global front in tackling the spread of disinformation. The inaugural session of this committee took place in November 2018.