Social media giants like Facebook need much stricter regulation to prevent them from getting away with behaving like “digital gangsters”, a UK parliamentary report on fake news concluded on Monday.
In its final report at the end of an 18-month investigation, the House of Commons Digital, Culture, Media and Sport (DCMS) Select Committee accused Facebook co-founder and CEO Mark Zuckerberg of showing contempt for Parliament by refusing three separate demands to give evidence and instead sending junior employees unable to answer all the questions.
“Companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world, considering themselves to be ahead of and beyond the law,” the report notes.
“By choosing not to appear before the Committee and by choosing not to respond personally to any of our invitations, Mark Zuckerberg has shown contempt towards the UK Parliament,” notes the report, which is extremely critical of Facebook’s online model.
The committee began its inquiry into the spread of fake news last year in the wake of the Cambridge Analytica (CA) data scandal, which revealed widespread misuse of people’s personal data by the now-defunct British company.
The committee’s report, ‘Disinformation and “fake news”: Final Report’, reiterates the role played by CA’s parent company, SCL Group, in foreign elections in countries like India, which had also been among the conclusions of its interim report last year.
“Data analytics firms have played a key role in elections around the world. Strategic communications companies frequently run campaigns internationally, which are financed by less than transparent means and employ legally dubious methods,” the report notes.
“As we said in our Interim Report, SCL Elections and its associated companies, including Cambridge Analytica, worked on campaigns that were not financed in a transparent way, overstepping legal and ethical boundaries,” it adds.
Besides India, SCL’s “political work” extended to election campaigns in African countries like Kenya, Ghana and Nigeria; Caribbean nations like Trinidad and Tobago; and others including Pakistan and Australia.
“Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised ‘dark adverts’ from unidentifiable sources, delivered through the major social media platforms we use every day,” concluded the report.
“The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights,” it added.
The cross-party parliamentary committee called for sites such as Facebook to be brought under regulatory control, arguing “social media companies cannot hide behind the claim of being merely a ‘platform’ and maintain that they have no responsibility themselves in regulating the content of their sites”.
In order to better regulate social media firms like Facebook, the MPs suggested creating a new category of tech firm – one that was neither a platform nor a publisher but something in-between, which would tighten the legal liability for content identified as harmful.
Facebook has repeatedly said it is committed to fighting fake news and works with more than 30 fact-checking organisations around the world.
In response to the report, it said it shared the committee’s concerns about “false news and election integrity” and said it had made a significant contribution to the investigation over the past 18 months, “answering more than 700 questions and with four of its most senior executives giving evidence”.
Facebook said in its statement: “We are open to meaningful regulation and support the committee’s recommendation for electoral law reform. But we’re not waiting. We have already made substantial changes so that every political ad on Facebook has to be authorised, state who is paying for it and then is stored in a searchable archive for seven years.”