Written Evidence to the Inquiry on Disinformation and ‘fake news’, House of Commons Digital, Culture, Media and Sport Committee

Carlo Kopp, Kevin Burt Korb, Bruce Mills

Research output: Other contribution › Other

Abstract

  • The potential damage produced by ‘fake news’ can be significant, and it should be treated as a genuine threat to the proper functioning of democratic societies;
  • The problem of social media users irresponsibly propagating falsehoods, without knowing these are falsehoods and without making any effort to determine their veracity, needs to be dealt with;
  • Under some circumstances, it does not take much ‘fake news’ to wholly disrupt consensus forming in a group;
  • Regulatory or other mechanisms introduced to disrupt, interdict or remove ‘fake news’ from social media will confront serious challenges in robustly identifying what is or is not ‘fake news’;
  • The ‘fake news’ problem in mass media is similar in many ways to the quality assurance problems that bedeviled manufacturing industries decades ago; there is a potential payoff in establishing process-based quality assurance standards for mass media, and in mandating compliance with such standards;
  • A model that should be explored is the use of media organisations with track records of bias-free, high-integrity news reporting to provide fact-checking services to social media organisations;
  • The commonly proposed model of “inoculation”, in which users are taught to think critically and test evidence, may be difficult and expensive to implement, and lazy social media users may simply not bother;
  • In the long term, an educational system that rigorously teaches critical thinking from the outset, the ability to identify logical fallacies, and respect for tested fact over subjective opinion would solve much of the ‘fake news’ problem, as it would destroy much of the ‘market’ for ‘fake news’; however, this does not address the near-term problem we observe, and would require increased investment in education;
  • A demerit point system applied uniformly across social media platforms, in which users are penalized for habitually propagating ‘fake news’, is an alternative to “inoculation”, but it confronts the same problems as fact-checking: who determines what is or is not ‘fake news’, and is that determination free of errors and bias?
  • A “silver bullet” solution to this problem may not exist, and strategies similar to those required to defeat biological contagions may be needed, involving interdiction of production and distribution together with “inoculation” of social media users;
  • Where social media are subjected to censorship mechanisms there is potential for abuse, and potential for free speech to be impaired, so any regulatory model has to be designed to prevent improper exploitation by any party;
  • The Committee’s inquiry into Disinformation and ‘fake news’ is immensely important, and not only for the UK, as other nations will almost certainly consider the UK’s response to the findings of this inquiry when framing their own measures for dealing with this pervasive problem.
Original language: English
Type: Submission to Inquiry on Disinformation and ‘fake news’, House of Commons Digital, Culture, Media and Sport Committee
Media of output: Website, House of Commons
Publisher: House of Commons Digital
Number of pages: 6
Place of publication: United Kingdom
Publication status: Published - 12 Dec 2018

Keywords

  • Social Media
  • Deception
