Dealing with the “Fake News” Problem

Submission to the Joint Standing Committee on Electoral Matters

Carlo Kopp, Kevin Burt Korb, Bruce Mills

Research output: Other contribution › Other

Abstract

  • The potential damage produced by “fake news” can be significant, and it should be treated as a genuine threat to the proper functioning of democratic societies;
  • The problem of social media users irresponsibly propagating falsehoods, without knowing or caring that these are falsehoods, and without making any effort to determine veracity or bias, needs to be dealt with;
  • It does not take much “fake news”, under some circumstances, to wholly disrupt consensus forming in a group; if allowed to propagate unchecked, very little “fake news” can do a great deal of damage;
  • Regulatory or other mechanisms that might be introduced to disrupt, interdict or remove “fake news” from social media will confront serious challenges in robustly identifying what is or is not “fake news”;
  • The “fake news” problem in mass media is similar in many ways to the quality assurance problems observed in manufacturing industries decades ago; mandatory process-based quality assurance standards for mass media could address much of this problem, just as quality assurance problems in other sectors have been resolved;
  • A model that should be explored is the use of media organisations with track records of bias-free, high-integrity news reporting to provide fact-checking services to social media organisations;
  • The commonly proposed model of “inoculation”, in which users are taught to think critically and test evidence for veracity and bias, may be difficult and expensive to implement, and lazy social media users may simply not bother;
  • The long-term solution is a robust educational system that rigorously teaches critical thinking from the very outset, starting in primary schools; students will need to learn to test source credibility, identify reasoning errors, and develop respect for tested fact over subjective opinion. This would solve much of the “fake news” problem, as it would destroy most of the ‘market’ for “fake news”;
  • A demerit point system uniformly applied across social media platforms, in which users are penalised for habitually propagating “fake news”, is an alternative to “inoculation”, but it confronts the same problems as fact-checking: who determines what is or is not “fake news”, and is that determination free of errors and bias?
  • There may not be any “silver bullet” solution to this problem, and comprehensive strategies similar to those required to defeat biological contagions may be needed, involving both the interdiction of “fake news” production and distribution and the “inoculation” of social media users;
  • There is genuine potential for abuse, and for free speech to be impaired, where social media are subjected to censorship mechanisms, so any regulatory models considered will have to be carefully designed and tested to prevent improper exploitation by any parties.
Original language: English
Type: Submission to Parliamentary Committee
Media of output: Website, Parliament House, Canberra
Publisher: Parliament of Australia
Number of pages: 17
Place of publication: Parliament House, Canberra
Edition: N/A
Volume: N/A
ISBN (Print): N/A
ISBN (Electronic): N/A
Publication status: Published - 8 Feb 2019

Keywords

  • Social Media
  • Deception

Cite this

Kopp, C., Korb, K. B., & Mills, B. (2019, Feb 8). Dealing with the “Fake News” Problem: Submission to the Joint Standing Committee on Electoral Matters. (N/A ed.) Parliament House, Canberra: Parliament of Australia.

URL: https://www.aph.gov.au/DocumentStore.ashx?id=eb43c34e-f922-488d-bb46-bf7a41abfaf2&subId=665089