PGA’s vision is to contribute to the creation of a Rules-Based International Order for a more equitable, safe, sustainable and democratic world.

2. Disinformation vs. misinformation: The issue of dangerous speech

Misinformation is false or inaccurate information (e.g., rumors, insults, and pranks). Disinformation is a subset of propaganda: incorrect information deliberately spread to deceive people, often sparking fear. It carries a severe risk of entrenching polarization and social fractures, which produces more violence and a reluctance from opposing parties to listen to each other and take concerted action. Disinformation thus infringes upon pluralism, an essential feature of democracy.

2.1 Types of misinformation and disinformation1

  • Fabricated Content: False content;
  • Manipulated Content: Genuine information or imagery that has been distorted, e.g., a sensational headline or populist ‘clickbait’;
  • Imposter Content: Impersonation of genuine sources, e.g., using the branding of an established agency;
  • Misleading Content: Misleading information, e.g., comment presented as fact;
  • False Context: Factually accurate content combined with incorrect contextual information, e.g., when the headline of an article does not reflect the content;
  • Satire and Parody: Humorous but false stories passed off as true. There is no intention to harm, but readers may be fooled;
  • False Connections: When headlines, visuals, or captions do not support the content;
  • Sponsored Content: Advertising or PR disguised as editorial content;
  • Propaganda: Content used to manage attitudes, values, and knowledge;
  • Error: A mistake made by established news agencies in their reporting.

Case studies

Russia

The Kremlin uses disinformation as one of its most important and far-reaching weapons. Russia has operationalized the concept of perpetual adversarial competition in the information environment by encouraging the development of a disinformation and propaganda ecosystem. This ecosystem creates and spreads false narratives to advance the Kremlin’s policy goals. Everything from human rights and environmental policy to assassinations and civilian-killing bombing campaigns is fair game in Russia’s malign playbook:2

Distinctive Features of the Contemporary Model for Russian Propaganda
  • High-volume and multichannel;
  • Rapid, continuous, and repetitive;
  • Lacks commitment to objective reality;
  • Lacks commitment to consistency.

Truth disarms Russia’s disinformation weapons. The Kremlin creates and spreads disinformation to confuse and overwhelm people about Russia’s actions in Ukraine, Georgia, and elsewhere in Europe and Africa.

The difficulty of dealing with disinformation is that source credibility is assessed based on “peripheral cues,” which may or may not conform to the reality of the situation. A broadcast that looks like a news broadcast, even if it is a propaganda broadcast, may have the same degree of credibility in one’s mind as an actual news broadcast.

2.2 How does propaganda undercut perceptions of reality?

  • People are poor judges of true versus false information and do not necessarily remember that particular information was incorrect.
  • Information overload leads people to take shortcuts in determining the trustworthiness of messages.
  • Familiar themes or messages can be appealing, even if they are false.
  • Statements are more likely to be accepted if backed by evidence, even if that evidence is false.
  • Peripheral cues—such as an appearance of objectivity—can increase the credibility of propaganda.

Technology evolves rapidly; parliaments and parliamentarians do not. Bureaucracy within parliaments prevents parliamentarians from processing pertinent information and from introducing, debating, and adopting laws at the pace of technological change. This gap does not mean that there is nothing to be done.

Legislative responses to disinformation may include:3

  • Reviewing and adapting law-related responses to disinformation to align these with international human rights standards (especially freedom of expression, including access to information and privacy rights) and making provisions for monitoring and evaluation;
  • Developing mechanisms for independent oversight and evaluating the efficacy of relevant legislation, policy, and regulation;
  • Developing tools for independent oversight and evaluating internet communication companies’ practices in fulfilling legal mandates in tackling disinformation;
  • Avoiding the criminalization of disinformation to ensure that legitimate journalism and other public interest information are not caught in the nets of “fake news” laws;
  • Avoiding the disproportionate use of internet shutdowns and social media restrictions as mechanisms to tackle disinformation;
  • Ensuring that legislation or regulation responding to disinformation crises, like the COVID-19 disinfodemic, is necessary, proportionate, and time-limited; and
  • Building an enabling legal environment that can support investment in strengthening independent media, including community and public service media, in the context of the economic impacts of the COVID-19 crisis.

2.3 Social Media and dangerous speech undermining democracy

Disinformation and the proliferation of fake profiles prevent individuals who attack others on social media from being held accountable for dangerous or hate speech. Impunity emboldens perpetrators. There is an ongoing debate between those who support anonymity as a shield for human rights defenders, dissidents, whistleblowers, and other vulnerable groups, and for inalienable rights such as privacy and freedom of expression, and those who argue that anonymity empowers individuals to spread hate speech, misinform, harass, and bully on social media.

Arguments are valid on both sides, but on a practical level, enforcing a “real identity” policy across social media platforms worldwide is technically very challenging and would be detrimental to already marginalized vulnerable populations. Additionally, domestic legislation would not impact comments made in other countries, including by individuals using location-spoofing technology.

From a substantive point of view, laws on incitement to violence or hate speech differ from country to country, and in some countries are nonexistent.

2.4 Gendered disinformation4

Disinformation can be defined as false information deliberately created to harm a person, social group, organization, or country. The false character of the information can also result from ‘manipulated information’ - disinformation campaigns often rely on accurate but distorted content, or on emotional content that has no truth value. Gendered disinformation, then, attacks or undermines people based on gender or weaponizes gendered narratives for political, social, or economic objectives.

The concept of gendered disinformation refers to any false or manipulated information intended to harm women or people of diverse genders and sexualities. Gendered disinformation campaigns often target individuals with higher public status or positions, such as politicians, CEOs, public advocates, and journalists. According to Professor Alana Moceri (IE School of Global and Public Affairs), gendered disinformation delegitimizes women’s participation in political life, undermining democracy and human rights worldwide. Disinformation also harms gender-diverse people, as it may escalate to hate-based crimes and killings in environments hostile to gender and sexual diversity.

Rather than directly attacking women’s policy decisions, gendered disinformation uses stereotypical gender characteristics and physical appearance to challenge female politicians. It ultimately aims to paint women as unfit for leadership, portraying women nominated or appointed to higher public office as unworthy of or incompetent for the position and undermining their ability to lead. Consequently, this discourages other women from pursuing political careers or higher positions.

Ana Blatnik’s blog, An Overlooked Threat To Democracy? Gendered Disinformation About Female Politicians,5 suggests the following steps to address this issue:

  • “Find fact-checking websites relevant to your region and topics of interest. For example, if interested in European Union politics, the EU Fact Check looks at the accuracy of political statements about current issues.
  • If available, always check multiple sources on the same topic when reading the news.
  • Look into and support organizations recognizing that gendered disinformation is a problem and advocating for solutions. An example is the EU Disinfo Lab, which has studied and written about gendered disinformation campaigns to highlight the issue.
  • Research ways to bring up the issue to relevant authorities in your country of residence and challenge your public representatives on what they have done to address disinformation and support women politicians who are the targets of disinformation campaigns.
  • Most importantly, continue to educate yourself about gender stereotypes and biases so you can recognize them when interacting with news about women politicians online, especially during election periods. The Women in International Security website has a Resources page that may be a good starting point in that regard.”

On August 7, 2023, at the 78th session of the United Nations General Assembly, the United Nations Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, Ms. Irene Khan, presented the most recent report on “Gendered Disinformation.” The report explores the negative impact of gendered disinformation, especially on women and gender non-conforming persons, and its implications for the right to freedom of expression. The report seeks to clarify the distinct nature of gendered disinformation, which is both a strategy to silence the free expression of women and gender-nonconforming persons and a threat to their safety and health, as well as a possible form of online gender-based violence.

United States6

Under the First Amendment to the U.S. Constitution, inflammatory speech must pass a very high bar to be prosecutable as incitement: an immediate and severe risk to a specific, identifiable person. The U.S. does not have hate speech laws, and the Supreme Court has repeatedly ruled that laws criminalizing hate speech violate freedom of speech.7

In January 2022, National Public Radio (NPR) conducted a poll which found that 64% of Americans believe that U.S. democracy is in crisis and at risk of failing. A strong indication that the situation is worsening is that over 70% of respondents in that poll said democracy is more at risk of failure now than it was a year ago.

One of the drivers of decreased confidence in the political system has been the explosion of misinformation deliberately aimed at disrupting the democratic process. This confuses and overwhelms voters. Throughout the 2020 election cycle, Russia’s cyber efforts and online actors influenced public perceptions and sought to amplify mistrust in the electoral process by denigrating mail-in voting, highlighting alleged irregularities, and accusing the Democratic Party of engaging in voter fraud. The “big lie” reinforced by President Trump about the 2020 election results amplified the Russian efforts and has lasting implications on voters’ trust in election outcomes.

The National Intelligence Council has found no indications that any foreign actor has interfered in the technical aspect of voting, such as voter registration, voting and casting ballots, vote tabulation, or reporting election results. However, spreading false information about the voting systems on social media destabilizes the public’s trust in election processes and outcomes.

England

In 2018, the Law Commission (“a non-political independent body, set up by Parliament in 1965 to keep all the law of England and Wales under review and to recommend reform where it is needed”) considered reform for criminal law to protect victims from online abuse. The scoping report noted:

“The anonymity (real or perceived) and the disinhibiting effect of the internet […] can also contribute to people saying and doing things online that they might not do in person in a communication offline. This might include explicit hate speech.”

The report’s focus and recommendations did not address anonymous online abuse, only potential changes that could be made to legislation to make the prosecution of identifiable perpetrators easier.

The right to free speech is not absolute; one of its limits is the commission of hate crimes. At face value, a range of public order (and other) offenses used to prosecute such behavior can apply equally in the physical and digital worlds. Many options would be available while protecting freedom of speech as best as possible.

Tunisia8

Riding a populist wave born of defeated expectations for a post-revolution democracy, President Saied took the helm of the Tunisian government in 2019 with a self-declared mandate to “renew confidence between the people and the rulers.” This spirit guides Saied and his dwindling supporters even now but comes expressed in bitter conspiracies, politically motivated arrests, and blatant lies. Saied, his state, and his supporters understand themselves as warriors against “traitors and the corrupt,” foreign meddlers, and all the other veiled figures who have supposedly “ruined the country during the past decade”—including, most recently, sub-Saharan migrants to the country. In practice, however, the president and his supporters are drivers of cyber-harassment and institutionalized repression of presumed critics, especially those from marginalized identities. Ironically, much of this repression now comes under the auspices of state efforts to fight vaguely defined “rumors” and “fake news” under Decree Law No. 54 of 2022.

Using forensic technology that was in part donated by the United Nations to “fight cybercrime,” President Saied’s government monitors critics online and arrests them for “conspiring against the state” once they publish something that fits the broad language of Saied’s anti-disinformation law, such as criticism of government officials, information about protests, or murmurs of revolution. President Saied then justifies violence against political outgroups by targeting them with conspiracy theories and dehumanizing rhetoric; in one case, Saied even labeled opposition figures a “cancer” that can only be “cured with chemicals.” And every time Saied claims to have broken up these conspiracies by oppressing members of the Tunisian political establishment, he strengthens his devoted base and cements his rule through “demoralization rather than enthusiasm,” convincing opponents that the risk of arrest is either too significant or that the latest conspiracy theory is simply not worth following.

Countries have taken different approaches to addressing disinformation in this and other contexts. Some, like Singapore, have enacted formal legislation; others, such as Argentina, have prosecuted individuals for disseminating fake news as a “crime against public order.” But there has been increased pressure on internet companies, mainly social media platforms, to monitor, identify, and filter “untruthful” content circulating on their networks. The willingness of these companies to accommodate the new demands constitutes a paradigm shift.9

The disinformation dilemma speaks to cultural, political, and legal weaknesses and strengths within each democracy. At the heart of the issue is a crisis of legitimacy among traditional knowledge producers. As legal scholar Jack Balkin writes, “A public sphere doesn’t work properly without trusted and trustworthy institutions guided by professional and public-regarding norms.” He argues that social media companies need to earn and develop that legitimacy while acknowledging that the same standards apply to societal institutions that traditionally have maintained public spheres and now struggle over questions of disinformation.

Politicians use dangerous speech amounting to incitement to violence or hate speech. The consequences are dire and may translate into physical and psychological harm. A prominent example was Trump’s speech on 6 January 2021, urging his supporters to march on the Capitol as Congress was certifying the results of November’s presidential election.

Although most states, as well as the European Union, reacted slowly to the increased threat of digital foreign election interference, efforts to tackle these issues have visibly and necessarily intensified in recent years. The voluntary restrictions adopted by social media platforms can go a long way toward mitigating the effects of disinformation campaigns. It is therefore crucial to work with the platforms, not against them.

Recommendations for parliamentarians10

  • Intensify collaboration between states and platforms in parliamentary committees or hearings;
  • Approve laws and policies that increase transparency, which is crucial in regulating digital disinformation;
  • Rules on the removal of manipulated material must be coherent and well-defined. Manipulated material and deep fakes must be addressed to avoid their potential implications for democratic elections. Definitions of illegal content should focus on illicit intent and on content manipulated to an extent that is not evident to the viewer;
  • Educational material should accompany legislation and voluntary restrictions; and
  • Allocation of funds to awareness raising and education campaigns.

PGA established a Global Parliamentary Code of Democratic Conduct, a self-regulation mechanism. Legislator-signatories pledge to refrain from propagating misinformation and disinformation and from using dangerous speech. Legislators and civil society alike may use this important accountability tool, which may also serve as a policy roadmap for government institutions, parliamentary committees, or political parties.

The misuse of social media to harass individuals has prevented the full civic participation of sexual and gender minorities. There is a need to increase representation and inclusion of vulnerable and marginalized populations, including women.

2.5 Disinformation campaigns

Case studies

Tunisian 2019 Elections

“In the lead-up to the vote, rumors circulated on Facebook that polling station pens would write in erasable ink and that candidates were withdrawing from the race to support Nabil Karoui, a Tunisian media mogul who faced off with Kais Saied in the second round of voting and who spent almost the entire 2019 campaign season in prison on suspicion of money laundering. At some points, even Karoui’s candidacy was called into doubt, with false reports that he had withdrawn or been released from prison. In investigative reports after the fact, both Facebook and the DFRLab at the Atlantic Council concluded that coordinated disinformation campaigns had targeted the 2019 Tunisian presidential election. One such campaign, known among disinformation researchers as “Operation Carthage,” was even carried out by a pro-Karoui Tunisian journalist posing as a neutral fact-checker. Similarly, after internal investigations discovered ties to an Israeli political marketing firm, Facebook removed some supposed “fact-checking” pages targeting Tunisians in May 2019.

Since winning the 2019 election, Tunisian President Kais Saied has only multiplied the power and relevance of disinformation in Tunisia’s fraying democracy. As former President Donald Trump demonstrated in the United States, disinformation from the highest levels of government increases both the cost of fighting it and the damage it causes to democratic institutions. Tunisia’s situation is no different, and the United States and the EU continue to legitimize Saied all the same.”11

Slovakia

“As Slovakia heads toward parliamentary elections on Saturday, the country has been flooded with disinformation, ranging from pro-Russian propaganda and lies about the situation in Ukraine to anti-migrant hate speech. Days before the parliamentary elections of September 30, which could lead to closer relations between this country of 5.4 million people and Moscow, voters have been inundated with disinformation from home and abroad, especially Russia.

[…] Reset, a London-based non-profit, said it had registered more than 365,000 election-related disinformation posts on Slovak social media in the first two weeks of September. According to the non-profit, the posts violating social media terms of service and containing disinformation had generated more than five times as much exposure as an average post.

The biggest spreaders of disinformation by far are Slovak politicians themselves – rather than influencers, Russian trolls, or conspiracy websites, said Peter Jancarik, co-founder of the Konspiratori.sk project, a public database of Czech and Slovak disinformation websites.    

He said the most significant victory for disinformation websites is not having more online visitors or becoming more popular but seeing their rhetoric taken up by politicians in the public sphere. "For many Slovak politicians, disinformation has become an everyday communication tool."

Former prime minister Robert Fico – a populist with pro-Russian leanings and a favorite in the polls, whose videos are among the most popular in Slovakia on Facebook, YouTube, and Telegram – is a prime example.

"Robert Fico manipulates disinformation to such a degree that even the most intelligent, cultured and well-educated Slovaks find it hard to understand," said Alain Soubigou, senior lecturer in contemporary Central European history at the Sorbonne.

Ahead of the election, Fico and far-right Republika party leader Milan Uhrik have already warned voters of potential vote-rigging – a strategy also employed by Donald Trump in his failed re-election bid in 2020.

Much of the disinformation circulating in recent weeks also serves Russian interests. On the campaign trail, Fico has said the war in Ukraine "started in 2014 when Ukrainian Nazis and fascists started murdering Russian citizens in Donbas and Luhansk" in east Ukraine – repeating a false narrative used by Russia’s President Vladimir Putin to justify his invasion.

Along the same lines, Slovak National Party Chairman Andrej Danko said in July that Russian-occupied territories were not “historically Ukrainian.”

“These channels of Russian influence are not entirely new," said Soubigou. "A large part of the political class was formed during the Communist era, and links have existed beyond the Velvet Revolution of 1989" in what was then Czechoslovakia.”12

Hungary

“[…] Orban’s ability to hammer his election message home, analysts say, is in large measure due to his government’s tight grip on what most Hungarians, especially those outside Budapest and other major cities, hear on the radio and watch on television.

State-run media marches in lockstep with the government, and many previously independent outlets have been bought up by Orban allies, said Eva Bognar, an academic researcher specializing in Hungarian media.

“There are concrete disinformation campaigns that are familiar to those who study Russian propaganda,” said Bognar, a program officer with Central European University’s Democracy Institute. In recent weeks, she said, “there were two main topics: the war in Ukraine and the election campaign, which was not unrelated to the war.”

In both cases, Bognar said, pro-Orban outlets employed “smear campaigns and disinformation — narratives that are in favor of and produced by the government.”

Orban’s brand of strident nationalism, promises of security, and culture-war fodder such as the demonization of LGBTQ people and Muslim migrants play better to a conservative rural base than they do in Hungary’s cosmopolitan capital, but even in urban areas, he has his devotees.

“He’s a strong guy, and that’s good for all of us!” said butcher Karoly Ludanyi, hefting a string of glistening sausage links at a stall in Budapest’s landmark central market hall. “In the EU, they’re too liberal.” The war in Ukraine was unfortunate, he said, but “not our fight.”

Hungary, which has long signaled allegiance with ethnic Hungarians in Ukraine, has taken in tens of thousands of refugees from the war — but Orban’s government has portrayed their presence as posing no threat, a sharp contrast to its vehement objections to those fleeing wars in the Middle East and Afghanistan.

Maria R., a secondary school chemistry teacher in a town outside Budapest, said she suspected that a few of her students, perhaps prodded by pro-Orban parents, were trying to goad her into making statements critical of government policies. She was sure they would report anything controversial, she said.

Fearing for her job, she said she resolutely kept her political views to herself and asked that her full name not be disclosed but felt saddened and demoralized at the idea that students, some of whom she has known since they were small, would seek to entrap her.”

“I feel like there is a bond that has been broken,” she said […]”13

Taiwan

“[…] Outside observers are often aware of Chinese attempts to influence Taiwan’s presidential elections. However, Taiwanese civil society activists such as Ttcat, co-founder of Doublethink Lab (a digital defense NGO and this author’s institution), identify the island’s local elections as being far more significant targets for China’s disinformation efforts. With Taiwanese voters set to elect candidates to fill over 10,000 offices, the information space is fragmented, which gives China greater scope to spread rumors and conspiracy theories through local communities. China also capitalizes on Taiwanese citizens’ lack of attention toward cross-strait relations in local votes. Chinese actors can intervene on behalf of candidates who espouse the value of closer economic ties with China, without these candidates suffering from fatal accusations that they will cede sovereignty to China. Elected candidates, especially city mayors, are then placed in prominent positions to challenge in the national and presidential elections held a year later. […]

Activists in Taiwan are braced for renewed Chinese efforts to interfere in the November local elections. Doublethink Lab has detected campaigns by Chinese actors to polarize Taiwanese society using false narratives regarding the origins of COVID-19 and incumbent government efforts to manipulate case statistics. Chinese-led disinformation campaigns are increasingly relying on YouTube as a medium for dissemination, the analysis of which calls for human coding at a scale that most watchdogs cannot presently muster. In advance of the November elections, Taiwanese society remains deeply polarized. […]”14


Footnotes:

1 United Nations High Commissioner for Refugees, Using Social Media in Community Based Protection: A Guide, Factsheet 4: Types of Misinformation and Disinformation.

2 Christopher Paul, Miriam Matthews, The Russian "Firehose of Falsehood" Propaganda Model.

3 Broadband Commission for Sustainable Development, Balancing Act: Countering Digital Disinformation While Respecting Freedom of Expression, published by International Telecommunication Union (ITU), September 2020.

4 Internet Governance Forum, Best Practice Forum on Gender and Digital Rights Exploring the concept of gendered disinformation, 2021.

5 Ana Blatnik, An Overlooked Threat To Democracy? Gendered Disinformation About Female Politicians, published by Women in International Security.

6 Gabriel R. Sanchez and Keesha Middlemass, Misinformation is eroding the public’s confidence in democracy, 26 July 2022.

7 Cf. Brandenburg test established in Brandenburg v. Ohio, 395 US 444 (1969).

8 C. Ian DeHaven, Disinformation as a Tool of Regime Survival in Tunisia, 21 July 2023, Arab Center Washington DC.

9 Agustina Del Campo, Disinformation Is Not Simply a Content Moderation Issue, Carnegie Endowment for International Peace, Published October 19, 2021.

10 Christoffer Waldemarsson, Disinformation, Deepfakes & Democracy, published by The Alliance of Democracies Foundation, 27 April 2020.

11 C. Ian DeHaven, Disinformation as a Tool of Regime Survival in Tunisia, published by the Arab Center Washington DC, 21 July 2023.

12 Slovakia swamped by disinformation ahead of parliamentary elections, published on 28 September 2023 by France24.

13 Laura King, Will Hungary’s Orban be the wedge Putin drives between Western allies?, published on 9 April 2022 by the Los Angeles Times.

14 Ben Sando, Taiwan Local Elections Are Where China’s Disinformation Strategies Begin, published on 4 October 2022 by the Council on Foreign Relations.
