Disinformation attack

{{short description|Coordinated dissemination of false information}}

Disinformation attacks are strategic deception campaigns{{Cite journal |last1=Bennett |first1=W Lance |last2=Livingston |first2=Steven |date=April 2018 |title=The disinformation order: Disruptive communication and the decline of democratic institutions |url=http://journals.sagepub.com/doi/10.1177/0267323118760317 |journal=European Journal of Communication |language=en |volume=33 |issue=2 |pages=122–139 |doi=10.1177/0267323118760317 |s2cid=149557690 |issn=0267-3231|url-access=subscription }} involving media manipulation and internet manipulation to disseminate misleading information,{{cite journal |last1=Wardle |first1=Claire |date=1 April 2023 |title=Misunderstanding Misinformation |journal=Issues in Science and Technology |volume=29 |issue=3 |pages=38–40 |doi=10.58875/ZAUD1691 |s2cid=257999777 |doi-access=free}} aiming to confuse, paralyze, and polarize an audience.{{Cite journal|last=Fallis|first=Don|date=2015|title=What Is Disinformation?|url=https://muse.jhu.edu/content/crossref/journals/library_trends/v063/63.3.fallis.html|journal=Library Trends|language=en|volume=63|issue=3|pages=401–426|doi=10.1353/lib.2015.0014|hdl=2142/89818|s2cid=13178809|issn=1559-0682|hdl-access=free}} Disinformation can be considered an attack when it involves orchestrated and coordinated efforts{{Cite book |last=Diaz Ruiz |first=Carlos |url=https://www.taylorfrancis.com/books/9781003506676 |title=Market-Oriented Disinformation Research: Digital Advertising, Disinformation and Fake News on Social Media |date=2025-03-14 |publisher=Routledge |isbn=978-1-003-50667-6 |edition=1 |location=London |pages=29 |language=en |chapter=The Polysemy of Disinformation: Definitions, Meanings, and Contradictions in Disinformation Research |doi=10.4324/9781003506676-2}} to build an adversarial narrative campaign that weaponizes multiple rhetorical strategies and forms of knowing—including not only falsehoods but also truths, half-truths, and value-laden judgements—to exploit and amplify identity-driven controversies.{{cite journal |last1=Diaz Ruiz |first1=Carlos |last2=Nilsson |first2=Tomas |date=2023 |title=Disinformation and Echo Chambers: How Disinformation Circulates on Social Media Through Identity-Driven Controversies |url=https://doi.org/10.1177/07439156221103852 |journal=Journal of Public Policy & Marketing |volume=42 |issue=1|pages=18–35 |doi=10.1177/07439156221103852 |s2cid=248934562 }} Disinformation attacks use media manipulation to target broadcast media like state-sponsored TV channels and radio stations.{{Cite journal |last1=Ajir |first1=Media |last2=Vailliant |first2=Bethany |date=2018 |title=Russian Information Warfare: Implications for Deterrence Theory |url=https://www.jstor.org/stable/26481910 |journal=Strategic Studies Quarterly |volume=12 |issue=3 |pages=70–89 |issn=1936-1815 |jstor=26481910}}{{cite web |last1=McKay |first1=Gillian |date=22 June 2022 |title=Disinformation and Democratic Transition: A Kenyan Case Study |url=https://www.stimson.org/2022/disinformation-and-democratic-transition-a-kenyan-case-study/ |website=Stimson Center}} Due to the increasing use of internet manipulation on social media, they can be considered a cyber threat.{{Cite book |last=Caramancion |first=Kevin Matthe |title=2020 3rd International Conference on Information and Computer Technologies (ICICT) |date=March 2020 |isbn=978-1-7281-7283-5 |pages=440–444 |chapter=An Exploration of Disinformation as a Cybersecurity Threat |doi=10.1109/ICICT50521.2020.00076 |chapter-url=https://ieeexplore.ieee.org/document/9092330 
|s2cid=218651389}}{{Cite journal |last=Downes |first=Cathy |date=2018 |title=Strategic Blind–Spots on Cyber Threats, Vectors and Campaigns |url=https://www.jstor.org/stable/26427378 |journal=The Cyber Defense Review |volume=3 |issue=1 |pages=79–104 |issn=2474-2120 |jstor=26427378}} Digital tools such as bots, algorithms, and AI technology, along with human agents including influencers, spread and amplify disinformation to micro-target populations on online platforms like Instagram, Twitter, Google, Facebook, and YouTube.{{Cite journal |last=Katyal |first=Sonia K. |date=2019 |title=Artificial Intelligence, Advertising, and Disinformation |url=https://muse.jhu.edu/article/745987 |journal=Advertising & Society Quarterly |language=en |volume=20 |issue=4 |doi=10.1353/asr.2019.0026 |issn=2475-1790 |s2cid=213397212|url-access=subscription }}

According to a 2018 report by the European Commission,{{Cite web |date=2018-04-26 |title=Communication - Tackling online disinformation: a European approach |url=https://digital-strategy.ec.europa.eu/en/library/communication-tackling-online-disinformation-european-approach |access-date=2023-11-15 |website=European Commission |language=en}} disinformation attacks can pose threats to democratic governance by diminishing the perceived legitimacy and integrity of electoral processes. Disinformation attacks are used by and against governments, corporations, scientists, journalists, activists, and other private individuals.{{cite web |title=Disinformation attacks have arrived in the corporate sector. Are you ready? |url=https://www.pwc.com/us/en/tech-effect/cybersecurity/corporate-sector-disinformation.html |access-date=6 December 2022 |website=PwC |language=en-us}} These attacks are commonly employed to reshape attitudes and beliefs, drive a particular agenda, or elicit certain actions from a target audience. Tactics include circulating incorrect or misleading information, creating uncertainty, and undermining the legitimacy of official information sources.{{Cite journal |last1=Collado |first1=Zaldy C. |last2=Basco |first2=Angelica Joyce M. |last3=Sison |first3=Albin A. |date=2020-06-26 |title=Falling victims to online disinformation among young Filipino people: Is human mind to blame? |url=http://www.cbbjournal.ro/index.php/en/2020/130-24-2/661-falling-victims-to-online-disinformation-among-young-filipino-people-is-human-mind-to-blame |journal=Cognition, Brain, Behavior |volume=24 |issue=2 |pages=75–91 |doi=10.24193/cbb.2020.24.05 |s2cid=225786653|url-access=subscription }}

An emerging area of disinformation research focuses on countermeasures to disinformation attacks. Technologically, defensive measures include machine learning applications and blockchain technologies that can flag disinformation on digital platforms.{{Cite journal |last1=Butincu |first1=Cristian Nicolae |last2=Alexandrescu |first2=Adrian |date=2023 |title=Blockchain-Based Platform to Fight Disinformation Using Crowd Wisdom and Artificial Intelligence |journal=Applied Sciences |volume=13 |issue=10 |page=6088 |doi=10.3390/app13106088 |doi-access=free |s2cid=258758377 |issn=2076-3417}} Socially, educational programs are being developed to teach people how to better distinguish between facts and disinformation online, and journalists publish recommendations for assessing sources. Commercially, revisions to algorithms, advertising, and influencer practices on digital platforms are proposed. Individual interventions include actions to improve one's own skills in dealing with information (e.g., media literacy) and direct actions to challenge disinformation.

== Goals ==

Disinformation attacks involve the intentional spreading of false information, with end goals of misleading and confusing people, encouraging violence,{{cite journal | last1=Brancati | first1=Dawn | last2=Penn | first2=Elizabeth M | title=Stealing an Election: Violence or Fraud? | journal=Journal of Conflict Resolution | volume=67 | issue=5 | date=2023 | issn=0022-0027 | doi=10.1177/00220027221120595 | pages=858–892}} and gaining money, power, or reputation. Disinformation attacks may involve political, economic, and individual actors. They may attempt to influence attitudes and beliefs, drive a specific agenda, get people to act in specific ways, or destroy the credibility of individuals or institutions. The presentation of incorrect information may be the most obvious part of a disinformation attack, but it is not the only purpose. The creation of uncertainty and the undermining of both correct information and the credibility of information sources are often intended as well.{{Cite book|last=Frederick|first=Kara|date=2019|title=The New War of Ideas: Counterterrorism Lessons for the Digital Disinformation Fight|publisher=Center for a New American Security|url=https://www.jstor.org/stable/resrep20399}}

=== Convincing people to believe incorrect information ===

[[File:Disinformation and echo chambers.jpg|thumb]]

If individuals can be convinced of something that is factually incorrect, they may make decisions that will run counter to the best interests of themselves and those around them. If the majority of people in a society can be convinced of something that is factually incorrect, the misinformation may lead to political and social decisions that are not in the best interest of that society. This can have serious impacts at both individual and societal levels.{{cite journal |last1=Lewandowsky |first1=Stephan |last2=Ecker |first2=Ullrich K. H. |last3=Seifert |first3=Colleen M. |last4=Schwarz |first4=Norbert |last5=Cook |first5=John |date=December 2012 |title=Misinformation and Its Correction: Continued Influence and Successful Debiasing |journal=Psychological Science in the Public Interest |language=en |volume=13 |issue=3 |pages=106–131 |doi=10.1177/1529100612451018 |issn=1529-1006 |pmid=26173286 |s2cid=42633 |doi-access=free }}

In the 1990s, a British doctor who held a patent on a single-shot measles vaccine promoted distrust of the combined MMR vaccine. His fraudulent claims were meant to promote sales of his own vaccine. The subsequent media frenzy increased fear, and many parents chose not to immunize their children.{{cite journal |last1=Davidson |first1=M |title=Vaccination as a cause of autism-myths and controversies. |journal=Dialogues in Clinical Neuroscience |date=December 2017 |volume=19 |issue=4 |pages=403–407 |doi=10.31887/DCNS.2017.19.4/mdavidson |pmid=29398935 |pmc=5789217 }}

This was followed by a significant increase in measles cases, hospitalizations, and deaths that would have been preventable by the MMR vaccine.{{cite magazine |last1=Quick |first1=Jonathan D. |last2=Larson |first2=Heidi |title=The Vaccine-Autism Myth Started 20 Years Ago. It Still Endures Today |url=https://time.com/5175704/andrew-wakefield-vaccine-autism/ |access-date=18 January 2023 |magazine=Time |date=February 28, 2018 |language=en}}{{cite journal |last1=Dubé |first1=Ève |last2=Ward |first2=Jeremy K. |last3=Verger |first3=Pierre |last4=MacDonald |first4=Noni E. |title=Vaccine Hesitancy, Acceptance, and Anti-Vaccination: Trends and Future Prospects for Public Health |journal=Annual Review of Public Health |date=1 April 2021 |volume=42 |issue=1 |pages=175–191 |doi=10.1146/annurev-publhealth-090419-102240 |pmid=33798403 |s2cid=232774243 |language=en |issn=0163-7525|doi-access=free }} It also led to substantial expenditures on follow-up research that tested the assertions made in the disinformation,{{cite journal |last1=Gerber |first1=JS |last2=Offit |first2=PA |title=Vaccines and autism: a tale of shifting hypotheses. |journal=Clinical Infectious Diseases |date=15 February 2009 |volume=48 |issue=4 |pages=456–61 |doi=10.1086/596476 |pmid=19128068 |pmc=2908388 }} and on public information campaigns attempting to correct the disinformation. The fraudulent claim continues to be referenced and to increase vaccine hesitancy.{{cite journal |last1=Pluviano |first1=S |last2=Watt |first2=C |last3=Della Sala |first3=S |title=Misinformation lingers in memory: Failure of three pro-vaccination strategies. |journal=PLOS ONE |date=2017 |volume=12 |issue=7 |pages=e0181640 |doi=10.1371/journal.pone.0181640 |pmid=28749996 |pmc=5547702 |bibcode=2017PLoSO..1281640P |doi-access=free }}

In the case of the 2020 United States presidential election, disinformation was used in an attempt to convince people to believe something that was not true and change the outcome of the election.{{cite journal |last1=Henricksen |first1=Wes |title=Disinformation and the First Amendment: Fraud on the Public |journal=St. John's Law Review |date=13 June 2023 |volume=96 |issue=3 |pages=543–589 |url=https://scholarship.law.stjohns.edu/lawreview/vol96/iss3/3/}}{{cite news |title=Exhaustive fact check finds little evidence of voter fraud, but 2020's 'Big Lie' lives on |url=https://www.pbs.org/newshour/show/exhaustive-fact-check-finds-little-evidence-of-voter-fraud-but-2020s-big-lie-lives-on |access-date=19 January 2023 |work=PBS NewsHour |date=17 December 2021 |language=en-us}} Repeated disinformation messages about the possibility of election fraud were introduced as early as 2016, years before the election took place.{{cite news |last1=Kuznia |first1=Rob |last2=Devine |first2=Curt |last3=Black |first3=Nelli |last4=Griffin |first4=Drew |title=Stop the Steal's massive disinformation campaign connected to Roger Stone {{!}} CNN Business |url=https://www.cnn.com/2020/11/13/business/stop-the-steal-disinformation-campaign-invs/index.html |access-date=19 January 2023 |work=CNN |date=14 November 2020 |language=en}}{{cite web |title=Foreign Threats to the 2020 US Federal Elections |url=https://www.dni.gov/files/ODNI/documents/assessments/ICA-declass-16MAR21.pdf |website=Intelligence Committee Assessment|date=10 March 2021 |access-date=27 January 2023}} Researchers found that much of the fake news originated in domestic right-wing groups. The nonpartisan Election Integrity Partnership reported prior to the election that "What we're seeing right now are essentially seeds being planted, dozens of seeds each day, of false stories... They're all being planted such that they could be cited and reactivated ... after the election." Multiple, repeated disinformation attacks laid the groundwork for claims that voting was unfair and for efforts to delegitimize the results of the election once it occurred.{{cite journal |last1=Miller |first1=Greg |title=As U.S. election nears, researchers are following the trail of fake news |journal=Science |date=26 Oct 2020 |url=https://www.science.org/content/article/us-election-nears-researchers-are-following-trail-fake-news |access-date=19 January 2023 |language=en}} Although the 2020 United States presidential election results were upheld, some people still believe the "big lie".

People who get information from a variety of news sources, not just sources from a particular viewpoint, are more likely to detect disinformation.{{cite news |last1=Atske |first1=Sara |title=3. Misinformation and competing views of reality abounded throughout 2020 |url=https://www.pewresearch.org/journalism/2021/02/22/misinformation-and-competing-views-of-reality-abounded-throughout-2020/ |access-date=19 January 2023 |work=Pew Research Center's Journalism Project |date=22 February 2021}} Tips for detecting disinformation include reading reputable local or national news sources rather than relying on social media; being wary of sensational headlines intended to attract attention and arouse emotion; fact-checking information broadly, not just on one usual platform or among friends; checking the original source of the information, including what was really said, who said it, and when; and considering possible agendas or conflicts of interest on the part of the speaker or those passing along the information.{{cite news |last1=Torres |first1=Emily |title=How To Thoughtfully Fact-Check Your Media Consumption |url=https://www.thegoodtrade.com/features/media-bias-fact-check/ |work=The Good Trade |date=25 May 2022}}{{cite news |last1=Lee |first1=Jenna Marina |title=How Fake News Affects U.S. Elections {{!}} University of Central Florida News |url=https://www.ucf.edu/news/how-fake-news-affects-u-s-elections/ |access-date=19 January 2023 |work=University of Central Florida News {{!}} UCF Today |date=26 October 2020 |language=en-us}}{{cite news |last1=Gebel |first1=Meira |date=January 15, 2021 |title=Misinformation vs. disinformation: What to know about each form of false information, and how to spot them online |work=Business Insider |url=https://www.businessinsider.com/guides/tech/misinformation-vs-disinformation |access-date=25 January 2023}}

=== Undermining correct information ===

Sometimes undermining belief in correct information is a more important goal of disinformation than convincing people to hold a new belief. In the case of the combined MMR vaccine, disinformation was originally intended to convince people of a specific fraudulent claim and thereby promote sales of a competing product. However, the impact of the disinformation became much broader. Fear that one type of vaccine might pose a danger fueled general fears that vaccines might pose a risk. Rather than merely convincing people to choose one product over another, the disinformation eroded belief in a whole area of medical research.

=== Creation of uncertainty ===

There is widespread agreement that disinformation sows confusion.{{cite news |last1=Barthel |first1=Michael |title=Many Americans Believe Fake News Is Sowing Confusion |url=https://www.pewresearch.org/journalism/2016/12/15/many-americans-believe-fake-news-is-sowing-confusion/ |access-date=18 January 2023 |work=Pew Research Center's Journalism Project |date=15 December 2016}}

This is not just a side effect; confusing and overwhelming people is an intentional objective.{{cite journal |last1=Choi |first1=Jihyang |last2=Lee |first2=Jae Kook |title=Confusing Effects of Fake News on Clarity of Political Information in the Social Media Environment |journal=Journalism Practice |date=26 November 2022 |volume=16 |issue=10 |pages=2147–2165 |doi=10.1080/17512786.2021.1903971 |s2cid=233705384 |url=https://www.tandfonline.com/doi/abs/10.1080/17512786.2021.1903971 |access-date=18 January 2023 |issn=1751-2786|url-access=subscription }} Whether disinformation attacks are used against political opponents or "commercially inconvenient science", they sow doubt and uncertainty as a way of undermining support for an opposing position and preventing effective action.

A 2016 paper describes social media-driven political disinformation tactics as a "firehose of falsehood" that "entertains, confuses and overwhelms the audience." Four characteristics were illustrated with respect to Russian propaganda: disinformation is used in a way that is 1) high-volume and multichannel, 2) continuous and repetitive, 3) lacking commitment to objective reality, and 4) lacking commitment to consistency. It becomes effective by creating confusion and by obscuring, disrupting, and diminishing the truth. When one falsehood is exposed, "the propagandists will discard it and move on to a new (though not necessarily more plausible) explanation."{{cite journal |last1=Paul |first1=Christopher |last2=Matthews |first2=Miriam |title=The Russian "Firehose of Falsehood" Propaganda Model: Why It Might Work and Options to Counter It |url=https://www.rand.org/pubs/perspectives/PE198.html |website=RAND Corporation |access-date=23 January 2023 |language=en |date=11 July 2016}} The purpose is not to convince people of a specific narrative, but to "Deny, deflect, distract".

Countering this is difficult, in part because "It takes less time to make up facts than it does to verify them." There is evidence that false information "cascades" travel farther, faster, and more broadly than truthful information, perhaps due to novelty and emotional loading.{{cite journal |last1=Vosoughi |first1=Soroush |last2=Roy |first2=Deb |last3=Aral |first3=Sinan |title=The spread of true and false news online |journal=Science |date=9 March 2018 |volume=359 |issue=6380 |pages=1146–1151 |doi=10.1126/science.aap9559 |pmid=29590045 |bibcode=2018Sci...359.1146V |s2cid=4549072 |language=en |issn=0036-8075|doi-access=free }}

Trying to fight a many-headed hydra of disinformation may be less effective than raising awareness of how disinformation works and how to identify it, before an attack occurs. For example, Ukraine was able to warn citizens and journalists about the potential use of state-sponsored deepfakes in advance of an actual attack, which likely slowed the disinformation's spread.{{cite news |last1=Allyn |first1=Bobby |title=Deepfake video of Zelenskyy could be 'tip of the iceberg' in info war, experts warn |url=https://www.npr.org/2022/03/16/1087062648/deepfake-video-zelenskyy-experts-war-manipulation-ukraine-russia |access-date=25 January 2023 |work=NPR |date=March 16, 2022}}

Another way to counter disinformation is to focus on identifying and countering its real objective. For example, if disinformation is trying to discourage voters, find ways to empower voters and elevate authoritative information about when, where and how to vote.{{cite news |last1=Heffner |first1=Alexander |last2=Miller |first2=Alan C. |title=We're launching an election-season ad campaign to fight fake news, and we need your help |url=https://www.usatoday.com/story/opinion/2020/09/13/fight-fake-news-protect-voting-democracy-fact-based-future-column/3460321001/ |access-date=23 January 2023 |work=USA TODAY |date=September 13, 2020}} If claims of voter fraud are being put forward, provide clear messaging about how the voting process occurs, and refer people back to reputable sources that can address their concerns.{{cite news |last1=Bond |first1=Shannon |last2=Parks |first2=Miles |last3=Jingnan |first3=Huo |title=Election officials feared the worst. Here's why baseless claims haven't fueled chaos |url=https://www.npr.org/2022/11/14/1136537352/2022-election-how-voting-went-misinformation |access-date=23 January 2023 |work=All Things Considered |date=November 14, 2022}}

=== Undermining of trust ===

Disinformation involves more than just a competition between inaccurate and accurate information. Disinformation, rumors, and conspiracy theories call into question underlying trust at multiple levels. Undermining of trust can be directed at scientists, governments, and the media, and it can have very real consequences. Public trust in science is essential to the work of policymakers and to good governance, particularly for issues in medicine, public health, and the environmental sciences. It is essential that individuals, organizations, and governments have access to accurate information when making decisions.{{cite journal |last1=Gundersen |first1=Torbjørn |last2=Alinejad |first2=Donya |last3=Branch |first3=T.Y. |last4=Duffy |first4=Bobby |last5=Hewlett |first5=Kirstie |last6=Holst |first6=Cathrine |last7=Owens |first7=Susan |last8=Panizza |first8=Folco |last9=Tellmann |first9=Silje Maria |last10=van Dijck |first10=José |last11=Baghramian |first11=Maria |date=17 October 2022 |title=A New Dark Age? Truth, Trust, and Environmental Science |url=https://www.annualreviews.org/doi/full/10.1146/annurev-environ-120920-015909 |journal=Annual Review of Environment and Resources |volume=47 |issue=1 |pages=5–29 |doi=10.1146/annurev-environ-120920-015909 |s2cid=250659393 |hdl-access=free |hdl=10852/99734}}

An example is disinformation around COVID-19 vaccines. Disinformation has targeted the products themselves, the researchers and organizations who develop them, the healthcare professionals and organizations who administer them, and the policy-makers that have supported their development and advised their use.{{cite journal |last1=Pertwee |first1=Ed |last2=Simas |first2=Clarissa |last3=Larson |first3=Heidi J. |title=An epidemic of uncertainty: rumors, conspiracy theories and vaccine hesitancy |journal=Nature Medicine |date=March 2022 |volume=28 |issue=3 |pages=456–459 |doi=10.1038/s41591-022-01728-z |pmid=35273403 |s2cid=247385552 |language=en |issn=1546-170X|doi-access=free }}{{cite book |last1=Barseghyan |first1=Arshaluys |last2=Grigoryan |first2=Lusine |last3=Pambukhchyan |first3=Anna |last4=Papyan |first4=Artur |title=Disinformation and Misinformation in Armenia: Confronting the Power of False Narratives |date=June 2021 |publisher=Freedom House |url=https://freedomhouse.org/sites/default/files/2021-06/Disinformation-in-Armenia_En-v3.pdf}}{{cite news |last1=Boyle |first1=Patrick |title=Why do so many Americans distrust science? |url=https://www.aamc.org/news-insights/why-do-so-many-americans-distrust-science |access-date=25 January 2023 |work=AAMC News |date=May 4, 2022 |language=en}}

Countries where citizens had higher levels of trust in society and government appear to have mobilized more effectively against the virus, as measured by slower virus spread and lower mortality rates.{{cite news |last1=Chew |first1=Bruce |last2=Flynn |first2=Michael |last3=Black |first3=Georgina |last4=Gupta |first4=Rajiv |title=Sustaining public trust in government |url=https://www2.deloitte.com/us/en/insights/industry/public-sector/government-trends/2021/public-trust-in-government.html |access-date=25 January 2023 |work=Deloitte Insights |date=4 March 2021 |language=en-us}}

Studies of people's beliefs about the amount of disinformation and misinformation in the news media suggest that distrust of traditional news media tends to be associated with reliance on alternate information sources such as social media. Structural support for press freedoms, a stronger independent press, and evidence of the credibility and honesty of the press can help to restore trust in traditional media as a provider of independent, honest, and transparent information.{{cite journal |last1=Hameleers |first1=Michael |last2=Brosius |first2=Anna |last3=de Vreese |first3=Claes H |title=Whom to trust? Media exposure patterns of citizens with perceptions of misinformation and disinformation related to the news media |journal=European Journal of Communication |date=June 2022 |volume=37 |issue=3 |pages=237–268 |doi=10.1177/02673231211072667 |s2cid=246785459 |language=en |issn=0267-3231|doi-access=free }}

=== Undermining of credibility ===

A major tactic of disinformation is to attack and attempt to undermine the credibility of people and organizations who are in a position to oppose the disinformation narrative due to their research or position of authority. This can include politicians, government officials, scientists, journalists, activists, human rights defenders and others.{{cite book |last1=Nyst |first1=Carly |last2=Monaco |first2=Nick |title=STATE-SPONSORED TROLLING How Governments Are Deploying Disinformation as Part of Broader Digital Harassment Campaigns |date=2018 |publisher=Institute for the Future |location=Palo Alto, CA |url=https://cgt.columbia.edu/wp-content/uploads/2018/11/Politics-of-Visual-Arts-Recommended-Reading.pdf |access-date=21 January 2023}}

For example, a New Yorker report in 2023 revealed details about a campaign run by the UAE, in which Emirati President Mohamed bin Zayed paid millions of euros to a Swiss businessman, Mario Brero, for "dark PR" against its targets. Brero and his company Alp Services used the UAE money to create damning Wikipedia entries and publish propaganda articles against Qatar and those with ties to the Muslim Brotherhood. Targets included the company Lord Energy, which eventually declared bankruptcy following unproven allegations of links to terrorism.{{cite magazine |url=https://www.newyorker.com/magazine/2023/04/03/the-dirty-secrets-of-a-smear-campaign|title=The Dirty Secrets of a Smear Campaign |access-date=27 March 2023 |date=27 March 2023 |magazine=The New Yorker |volume=99 |issue=7 |first1=David D. |last1=Kirkpatrick}} Alp was also paid by the UAE to publish 100 propaganda articles a year against Qatar.{{cite web|url=https://www.mediapart.fr/en/journal/france/040323/leaked-data-shows-extent-uaes-meddling-france|title=Leaked data shows extent of UAE's meddling in France|access-date=4 March 2023|website=MediaPart|date=4 March 2023 }}

Disinformation attacks on scientists and science, including attacks funded by the tobacco and fossil fuels industries, have been painstakingly documented in books such as Merchants of Doubt,{{cite journal |last1=Kitcher |first1=Philip |title=The Climate Change Debates |journal=Science |date=4 June 2010 |volume=328 |issue=5983 |pages=1230–1234 |doi=10.1126/science.1189312 |bibcode=2010Sci...328.1230K |s2cid=154865206 |url=https://www.science.org/doi/10.1126/science.1189312 |access-date=18 January 2023 |language=en |issn=0036-8075}}{{cite journal |last1=Levy |first1=Adam |date=30 May 2023 |title=Scientists warned about climate change in 1965. Nothing was done. |url=https://knowablemagazine.org/article/food-environment/2023/scientists-warned-climate-change-1965-podcast |journal=Knowable Magazine |language=en |doi=10.1146/knowable-052523-1 |doi-broken-date=4 June 2025 |doi-access=free}}{{cite book |last1=Oreskes |first1=Naomi |last2=Conway |first2=Erik M. |title=Merchants of doubt: how a handful of scientists obscured the truth on issues from tobacco smoke to global warming |date=2010 |publisher=Bloomsbury Press |location=New York |isbn=978-1-59691-610-4 |edition=1st U.S.}} Doubt Is Their Product,{{cite journal |author-first=Britt E. |author-last=Erickson |title=Manufacturing Uncertainty |journal=Chemical and Engineering News |volume=86 |issue=46 |pages=77–8 |date=November 17, 2008 |doi= 10.1021/cen-v086n046.p077|url=http://cen.acs.org/articles/86/i46/Manufacturing-Uncertainty.html|url-access=subscription }}{{cite book |last1=Michaels |first1=David |title=Doubt is their product: how industry's assault on science threatens your health |date=2008 |publisher=Oxford University Press |location=Oxford |isbn=978-0199719761}} and The Triumph of Doubt: Dark Money and the Science of Deception (2020).{{cite journal |last1=Kirshenbaum |first1=Sheril |title=The art of misleading the public The Triumph of Doubt: Dark Money and the Science of Deception David Michaels Oxford University Press, 2020. 344 pp. |journal=Science |date=14 February 2020 |volume=367 |issue=6479 |pages=747 |doi=10.1126/science.aba5495 |s2cid=211110439 |url=https://www.science.org/doi/10.1126/science.aba5495 |access-date=18 January 2023 |language=en |issn=0036-8075|url-access=subscription }}{{cite book |last1=Michaels |first1=David |title=The triumph of doubt: dark money and the science of deception |date=2020 |publisher=Oxford University Press |location=Oxford |isbn=978-0190922665}} While scientists, doctors, and teachers are considered the most trustworthy professionals globally, scientists are concerned that confidence in science has decreased. Sudip Parikh, CEO of the American Association for the Advancement of Science (AAAS), was quoted in 2022 as saying, "We now have a significant minority of the population that's hostile to the scientific enterprise... We're going to have to work hard to regain trust." At the same time that disinformation poses a threat, the widespread use of social media by scientists offers an unprecedented opportunity for scientific communication and engagement between scientists and the public, with the potential to increase public knowledge.{{cite journal |last1=Maibach |first1=Edward W. 
|last2=Uppalapati |first2=Sri Saahitya |last3=Orr |first3=Margaret |last4=Thaker |first4=Jagadish |title=Harnessing the Power of Communication and Behavior Science to Enhance Society's Response to Climate Change |journal=Annual Review of Earth and Planetary Sciences |date=31 May 2023 |volume=51 |issue=1 |pages=53–77 |doi=10.1146/annurev-earth-031621-114417 |bibcode=2023AREPS..51...53M |language=en |issn=0084-6597|doi-access=free }}

The American Council on Science and Health has advice for scientists facing a disinformation campaign, and notes that disinformation campaigns often incorporate some elements of truth to make them more convincing. The five recommendations are: identifying and acknowledging any parts of the story that are actually true; explaining why other parts are untrue, out of context, or manipulated; calling out motivations that may be behind the disinformation, such as financial interests or power; preparing an "accusation audit" in anticipation of further attacks; and maintaining calm and self-control.{{cite news |last1=Berezow |first1=Alex |title=How to Fight a Disinformation Campaign |url=https://www.acsh.org/news/2021/03/10/how-fight-disinformation-campaign-15391 |access-date=18 January 2023 |work=American Council on Science and Health |date=10 March 2021 |language=en}} Others recommend educating oneself about the platforms one uses and the privacy tools that platforms offer to protect personal information and to mute, block, and report online participants. Disinformers and online trolls are unlikely to engage in reasoned discussion or interact in good faith, and responding to them is rarely useful.

Studies clearly document the harassment of scientists, both personally and in terms of their scientific credibility. In 2021, a Nature survey reported that nearly 60% of scientists who had made public statements about COVID-19 had their credibility attacked. Attacks disproportionately affected those in nondominant identity groups such as women, transgender people, and people of color.{{cite news |last1=Abrams |first1=Zara |title= The anatomy of a misinformation attack |work=Monitor on Psychology |date=June 1, 2022 |url=https://www.apa.org/monitor/2022/06/news-misinformation-attack |access-date=16 May 2023|publisher=American Psychological Association}}

A highly visible example is Anthony S. Fauci. He is deeply respected nationally and internationally as an expert on infectious diseases. He also has been subjected to intimidation, harassment and death threats fueled by disinformation attacks and conspiracy theories.{{cite news |last1=Resneck Jr. |first1=Jack |title=Dr. Fauci's dedication to medical science served the world well |url=https://www.ama-assn.org/about/leadership/dr-fauci-s-dedication-medical-science-served-world-well |access-date=18 January 2023 |work=American Medical Association |date=December 28, 2022 |language=en}}{{cite news |last1=Bredow |first1=Rafaela von |title=Anthony Fauci's Life as a Right Wing Target: "The Evil in the World" |url=https://www.spiegel.de/international/world/anthony-fauci-s-life-as-a-right-wing-target-the-evil-in-the-world-a-f5b09c65-4193-4b90-89fd-657caf63b32c |access-date=18 January 2023 |work=Der Spiegel |date=17 November 2022 |language=en}}{{cite news |last1=Stacey |first1=Kiran |title=Anthony Fauci: America's doctor under siege |url=https://www.ft.com/content/e93fb980-acc7-4887-bacc-5a1994fe42cb |access-date=18 January 2023 |work=Financial Times |date=5 June 2021}} Despite those experiences, Fauci encourages early-career scientists "not to be deterred, because the satisfaction and the degree of contribution you can make to society by getting into public service and public health is immeasurable."{{cite journal |last1=Kozlov |first1=Max |title=Fauci responds to Musk's Twitter attack and rates world's COVID response |journal=Nature |date=13 December 2022 |volume=612 |issue=7941 |pages=599 |language=en |doi=10.1038/d41586-022-04432-7|pmid=36513820 |bibcode=2022Natur.612..599K |s2cid=254675192 |doi-access=free }}

=== Undermining of collective action including voting ===

Individual decisions, like whether or not to smoke, are major targets for disinformation. So are policymaking processes such as the formation of public health policy, the recommendation and adoption of policy measures, and the acceptance or regulation of processes and products. Public opinion and policy interact: public opinion and the popularity of public health measures can strongly influence government policy and the creation and enforcement of industry standards. Disinformation attempts to undermine public opinion and prevent the organization of collective actions, including policy debates, government action, regulation, and litigation.

An important type of collective activity is the act of voting. In the 2017 Kenyan general election, 87% of Kenyans surveyed reported encountering disinformation before the August election, and 35% reported being unable to make an informed voting decision as a result. Disinformation campaigns often target specific groups such as black or Latino voters to discourage voting and civic engagement. Fake accounts and bots are used to amplify uncertainty about whether voting really matters, whether voters are "appreciated", and whose interests politicians care about.{{cite news |last1=Bond |first1=Shannon |title=Black And Latino Voters Flooded With Disinformation In Election's Final Days |url=https://www.npr.org/2020/10/30/929248146/black-and-latino-voters-flooded-with-disinformation-in-elections-final-days |access-date=19 January 2023 |work=NPR |date=October 30, 2020}}{{cite news |last1=Garcia-Navarro |first1=Lulu |last2=Bryant |first2=Ashley |title=Progressive Group Combats Disinformation Campaigns Aimed At Latino Voters |url=https://www.npr.org/2020/10/18/925069823/progressive-group-combats-disinformation-campaigns-aimed-at-latino-voters |access-date=19 January 2023 |date=October 18, 2020}} Microtargeting can present messages precisely designed for a chosen population, while geofencing can pinpoint people based on where they go, like churchgoers. In some cases, voter suppression attacks have circulated incorrect information about where and when to vote.{{cite news |last1=Lai |first1=Samantha |title=Data misuse and disinformation: Technology and the 2022 elections |url=https://www.brookings.edu/blog/techtank/2022/06/21/data-misuse-and-disinformation-technology-and-the-2022-elections/ |access-date=19 January 2023 |work=Brookings |date=21 June 2022}} During the 2020 U.S. Democratic primaries, disinformation narratives arose around the use of masks and the use of mail-in ballots, relating to whether and how people would vote.{{cite journal |last1=Chen |first1=Emily |last2=Chang |first2=Herbert |last3=Rao |first3=Ashwin |last4=Lerman |first4=Kristina |last5=Cowan |first5=Geoffrey |last6=Ferrara |first6=Emilio |title=COVID-19 misinformation and the 2020 U.S. presidential election |journal=Harvard Kennedy School Misinformation Review |date=3 March 2021 |doi=10.37016/mr-2020-57 |s2cid=233772524 |url=https://misinforeview.hks.harvard.edu/article/covid-19-misinformation-and-the-2020-u-s-presidential-election/ |access-date=19 January 2023|doi-access=free }}

=== Undermining of functional government ===

Disinformation strikes at the foundation of democratic government: "the idea that the truth is knowable and that citizens can discern and use it to govern themselves."{{cite book |last1=Brandt |first1=Jessica |last2=Ichihara |first2=Maiko |last3=Jalli |first3=Nuurrianti |last4=Shen |first4=Puma |last5=Sinpeng |first5=Aim |title=Impact of disinformation on democracy in Asia |date=14 December 2022 |publisher=Brookings Institution |url=https://www.brookings.edu/research/impact-of-disinformation-on-democracy-in-asia/}} Disinformation campaigns are designed by both foreign and domestic actors to gain political and economic advantage. The undermining of functional government weakens the rule of law and enables those actors to profit politically and economically. At home and abroad, the goal is to weaken opponents. Elections are an especially critical target, but the day-to-day ability to govern is also undermined.{{cite web |last1=Kalniete |first1=Sandra |title=Report on foreign interference in all democratic processes in the European Union, including disinformation {{!}} A9-0022/2022 |url=https://www.europarl.europa.eu/doceo/document/A-9-2022-0022_EN.html |website=European Parliament |access-date=21 January 2023 |language=en}}

The Oxford Internet Institute at Oxford University reports that in 2020, organized social media manipulation campaigns were active in 81 countries, an increase from 70 countries in 2019. Of those countries, 76 used disinformation attacks. The report describes disinformation as being produced globally "on an industrial scale".{{cite web |title=Social media manipulation by political actors now an industrial scale problem prevalent in over 80 countries – annual Oxford report |url=https://www.oii.ox.ac.uk/news-events/news/social-media-manipulation-by-political-actors-now-an-industrial-scale-problem-prevalent-in-over-80-countries-annual-oxford-report/ |access-date=27 January 2023 |website=Oxford Internet Institute |publisher=Oxford University |date=13 Jan 2021}}

A Russian operation known as the Internet Research Agency (IRA) spent thousands of dollars on social media ads to influence the 2016 United States presidential election, confuse the public on key political issues, and sow discord. These political ads leveraged user data to micro-target certain populations and spread misleading information, with the end goal of exacerbating polarization and eroding public trust in political institutions.{{Cite journal|last1=Crain|first1=Matthew|last2=Nadler|first2=Anthony|date=2019|title=Political Manipulation and Internet Advertising Infrastructure|journal=Journal of Information Policy|volume=9|pages=370–410|doi=10.5325/jinfopoli.9.2019.0370|jstor=10.5325/jinfopoli.9.2019.0370|s2cid=214217187|issn=2381-5892|doi-access=free}} The Computational Propaganda Project at the Oxford Internet Institute found that the IRA's ads specifically sought to sow mistrust towards the U.S. government among Mexican Americans and discourage voter turnout among African Americans.{{Cite journal|last=Prier|first=Jarred|date=2017|title=Commanding the Trend: Social Media as Information Warfare|url=https://www.jstor.org/stable/26271634|journal=Strategic Studies Quarterly|volume=11|issue=4|pages=50–85|jstor=26271634|issn=1936-1815}}

An examination of Twitter activity prior to the 2017 French presidential election indicates that 73% of the disinformation flagged by Le Monde was traceable to two political communities: one associated with François Fillon (right-wing, with 50.75% of the fake link shares) and another with Marine Le Pen (extreme-right wing, 22.21%). Early spreaders of disinformation made up 6% of accounts in the Fillon community and 5% in the Le Pen community. Debunking of the disinformation came from other communities, and was most often related to Emmanuel Macron (39.18% of debunks) and Jean-Luc Mélenchon (14% of debunks).{{cite journal |last1=Gaumont |first1=Noé |last2=Panahi |first2=Maziyar |last3=Chavalarias |first3=David |title=Reconstruction of the socio-semantic dynamics of political activist Twitter networks—Method and application to the 2017 French presidential election |journal=PLOS ONE |date=19 September 2018 |volume=13 |issue=9 |pages=e0201879 |doi=10.1371/journal.pone.0201879 |pmid=30231018 |pmc=6145593 |bibcode=2018PLoSO..1301879G |language=en |issn=1932-6203|doi-access=free }}

Another analysis, of the 2017 #MacronLeaks disinformation campaign, illustrates frequent patterns of election-related disinformation campaigns. Such campaigns often peak 1–2 days before an election. The scale of a campaign like #MacronLeaks can be comparable to the volume of regular discussion in that time period, suggesting that it can obtain considerable collective attention. About 18 percent of the users involved in #MacronLeaks were identifiable as bots. Spikes in bot content tended to occur slightly ahead of spikes in human-created content, suggesting bots were able to trigger cascades of disinformation. Some bot accounts showed a pattern of previous use: creation shortly before the 2016 U.S. presidential election, brief usage then, and no further activity until early May 2017, prior to the French election. Alt-right media personalities including Britain's Paul Joseph Watson and American Jack Posobiec prominently shared MacronLeaks content prior to the French election.{{cite journal |last1=Ferrara |first1=Emilio |title=Disinformation and social bot operations in the run up to the 2017 French presidential election |journal=First Monday |date=7 August 2017 |volume=22 |issue=8 |doi=10.5210/fm.v22i8.8005 |arxiv=1707.00086 |s2cid=9732472 |language=en |doi-access=free }} Experts worry that disinformation attacks will increasingly be used to influence national elections and democratic processes.

{{external media | width = 210px | float = right | headerimage= | video1 = [https://www.youtube.com/watch?v=SObIrZ19heg&t=13s&ab_channel=KnowableMagazine "The psychology and politics of conspiracy theories"], Knowable Magazine, October 27, 2021.}}

In A Lot of People Are Saying: The New Conspiracism and the Assault on Democracy (2020), Nancy L. Rosenblum and Russell Muirhead examine the history and psychology of conspiracy theories and the ways in which they are used to delegitimize the political system. They distinguish between classical conspiracy theory, in which actual issues and events (such as the assassination of John F. Kennedy) are examined and combined to create a theory, and a new form of "conspiracism without theory" that relies on repeating false statements and hearsay without factual grounding.{{cite journal |last1=Nacos |first1=Brigitte L. |title=A Lot of People Are Saying: The New Conspiracism and the Assault on Democracy, Russell Muirhead and Nancy L. Rosenblum |journal=Political Science Quarterly |date=2021 |volume=136 |issue=3 |doi=10.1002/polq.13224 |s2cid=239622944 |url=https://www.psqonline.org/article.cfm?IDArticle=20203 |access-date=27 October 2021|url-access=subscription }}{{cite journal |last1=Miller |first1=Greg |title=The enduring allure of conspiracies |journal=Knowable Magazine |date=14 January 2021|doi-access=free |doi=10.1146/knowable-011421-2 |url=https://knowablemagazine.org/article/mind/2021/the-enduring-allure-conspiracies |access-date=9 December 2021}}

Such disinformation exploits human bias towards accepting new information. Humans constantly share information and rely on others to provide information they cannot verify for themselves. Much of that information will be true, whether they ask if it is cold outside or cold in Antarctica. As a result, they tend to believe what they hear. Studies show an "illusory truth effect": the more often people hear a claim, the more likely they are to consider it true. This is the case even when people identify a statement as false the first time they see it; they are likely to rank the probability that it is true higher after multiple exposures.{{cite journal |last1=Brashier |first1=Nadia M. |last2=Marsh |first2=Elizabeth J. |title=Judging Truth |journal=Annual Review of Psychology |date=4 January 2020 |volume=71 |issue=1 |pages=499–515 |doi=10.1146/annurev-psych-010419-050807 |pmid=31514579 |s2cid=202569061 |language=en |issn=0066-4308|doi-access=free }}

Social media is particularly dangerous as a source of disinformation because bots and multiple fake accounts are used to repeat and magnify the impact of false statements. Algorithms track what users click on and recommend content similar to what users have chosen, creating confirmation bias and filter bubbles. In more tightly focused communities, an echo chamber effect is enhanced.{{cite web |title=Digital Media Literacy: What is an Echo Chamber? |url=https://edu.gcfglobal.org/en/digital-media-literacy/what-is-an-echo-chamber/1/ |website=GCFGlobal.org |access-date=23 January 2023 |language=en}}{{Cite journal|last=Peck|first=Andrew|date=2020|title=A Problem of Amplification: Folklore and Fake News in the Age of Social Media|url=https://www.jstor.org/stable/10.5406/jamerfolk.133.529.0329|journal=The Journal of American Folklore|volume=133|issue=529|pages=329–351|doi=10.5406/jamerfolk.133.529.0329|jstor=10.5406/jamerfolk.133.529.0329|s2cid=243130538|issn=0021-8715|url-access=subscription}}{{Cite journal|last=Unver|first=H. Akin|title=Politics of Automation, Attention, and Engagement|date=2017|url=https://www.jstor.org/stable/26494368|journal=Journal of International Affairs|volume=71|issue=1|pages=127–146|jstor=26494368|issn=0022-197X}}

Autocrats have employed domestic voter disinformation attacks to cover up electoral corruption. Voter disinformation can include public statements that assert local electoral processes are legitimate and statements that discredit electoral monitors. Public-relations firms may be hired to execute specialized disinformation campaigns, including media advertisements and behind-the-scenes lobbying, to push the narrative of an honest and democratic election.

Independent monitoring of the electoral process is essential to combatting electoral disinformation. Monitoring can include both citizen election monitors and international observers, as long as they are credible. Norms for accurate characterization of elections are based on ethical principles, effective methodologies, and impartial analysis. Democratic norms emphasize the importance of open electoral data, the free exercise of political rights, and protection for human rights.{{Cite journal|last=Merloe|first=Patrick|date=2015|title=Election Monitoring Vs. Disinformation|url=https://muse.jhu.edu/content/crossref/journals/journal_of_democracy/v026/26.3.merloe.html|journal=Journal of Democracy|language=en|volume=26|issue=3|pages=79–93|doi=10.1353/jod.2015.0053|s2cid=146751430|issn=1086-3214|url-access=subscription}}

=== Increasing polarization and legitimizing violence ===

Disinformation attacks can increase political polarization and alter public discourse. Foreign manipulation campaigns may attempt to amplify extreme positions and weaken a target society, while domestic actors may try to demonize political opponents.

States with highly polarized political landscapes and low public trust in local media and government are particularly vulnerable to disinformation attacks.{{Cite journal|last1=Humprecht|first1=Edda|last2=Esser|first2=Frank|last3=Van Aelst|first3=Peter|date=July 2020|title=Resilience to Online Disinformation: A Framework for Cross-National Comparative Research|journal=The International Journal of Press/Politics|language=en|volume=25|issue=3|pages=493–516|doi=10.1177/1940161219900126|s2cid=213349525|issn=1940-1612|doi-access=free|hdl=10067/1661680151162165141|hdl-access=free}}{{cite journal |last1=Kleinfeld |first1=Rachel |title=The Rise of Political Violence in the United States |journal=Journal of Democracy |date=2021 |volume=32 |issue=4 |pages=160–176 |doi=10.1353/jod.2021.0059 |s2cid=239879073 |url=https://www.journalofdemocracy.org/articles/the-rise-of-political-violence-in-the-united-states/|doi-access=free |url-access=subscription }}

There is concern that Russia will employ disinformation, propaganda, and intimidation to destabilize NATO members such as the Baltic states, and coerce them into accepting Russian narratives and agendas.

During the Russo-Ukrainian War that began in 2014, Russia combined traditional combat warfare with disinformation attacks in a form of hybrid warfare, seeking to sow doubt and confusion among enemy populations, intimidate adversaries, erode public trust in Ukrainian institutions, and boost Russia's reputation and legitimacy.{{Cite journal|last=Wither|first=James K.|date=2016|title=Making Sense of Hybrid Warfare |url=https://connections-qj.org/article/making-sense-hybrid-warfare |journal=Connections|volume=15|issue=2|pages=73–87|doi=10.11610/Connections.15.2.06|jstor=26326441|issn=1812-1098|doi-access=free}} Since escalating the Russo-Ukrainian War with the 2022 Russian invasion of Ukraine, Russia's pattern of disinformation has been described by CBC News as "Deny, deflect, distract".

Thousands of stories have been debunked, including doctored photographs and deepfakes. At least 20 main "themes" are being promoted by Russian propaganda, targeting audiences far beyond Ukraine and Russia. Many of these try to reinforce ideas that Ukraine is somehow Nazi-controlled, that its military forces are weak, and that damage and atrocities are due to Ukrainian, not Russian, actions.{{cite news |last1=Zabjek |first1=Alexandra |title='Deny, deflect, distract': How Russia spreads disinformation about the war in Ukraine |url=https://www.cbc.ca/news/politics/disinformation-ukraine-stop-fake-org-1.6721522 |access-date=23 January 2023 |work=CBC News |date=January 22, 2023}} Many of the images examined by fact-checkers are shared on Telegram. Government organizations and independent journalistic groups such as Bellingcat work to confirm or deny such reports, often using open-source data and sophisticated tools to identify where and when information originated and whether claims are legitimate. Bellingcat works to provide an accurate account of events as they happen and to create a permanent, verified, longer-term record.{{cite news |last1=Nicholson |first1=Katie |title=There's a flood of disinformation about Russia's invasion of Ukraine. Here's who's sorting it out |url=https://www.cbc.ca/news/world/fact-checkers-ukraine-1.6365682 |access-date=23 January 2023 |work=CBC News |date=Feb 27, 2022}}

Fear-mongering and conspiracy theories are used to encourage polarization, to promote exclusionary narratives, and to legitimize hate speech and aggression.{{cite journal |last1=Atran |first1=Scott |title=Psychology of Transnational Terrorism and Extreme Political Conflict |journal=Annual Review of Psychology |date=4 January 2021 |volume=72 |issue=1 |pages=471–501 |doi=10.1146/annurev-psych-010419-050800 |pmid=32898462 |s2cid=221572429 |url=https://www.annualreviews.org/doi/full/10.1146/annurev-psych-010419-050800 |access-date=23 January 2023 |language=en |issn=0066-4308|url-access=subscription }} As has been painstakingly documented, the period leading up to the Holocaust was marked by repeated disinformation and increasing persecution by the Nazi government,{{cite book |last1=Fischer |first1=Conan |title=The rise of the Nazis |date=2002 |publisher=Manchester University Press |location=Manchester, UK |isbn=0-7190-6067-2 |pages=47–49 |edition=Second}}{{cite news |last1=Dunkel |first1=Tom |title=How Hate-Fueled Misinformation and Propaganda Grew in Nazi Germany |url=https://lithub.com/how-hate-fueled-misinformation-and-propaganda-grew-in-nazi-germany/ |access-date=27 January 2023 |work=Literary Hub |date=13 October 2022}} culminating in the mass murder of 165,200 German Jews{{Cite encyclopedia |encyclopedia=Holocaust Encyclopedia |publisher=United States Holocaust Memorial Museum|url=https://encyclopedia.ushmm.org/content/en/article/jewish-losses-during-the-holocaust-by-country|title = Jewish Losses during the Holocaust: By Country}} by a "genocidal state".{{cite book |last1=Berenbaum |first1=Michael |title=The world must know: the history of the Holocaust as told in the United States Holocaust Memorial Museum |date=2006 |publisher=United States Holocaust Memorial Museum |location=Washington, D.C. |isbn=978-0-8018-8358-3 |page=103 |edition=2nd }}

Populations in Africa, Asia, Europe and South America today are considered to be at serious risk for human rights abuses. Changing conditions in the United States have also been identified as increasing risk factors for violence.

Elections are particularly tense political transition points, emotionally charged at any time, and increasingly targeted by disinformation. These conditions increase the risk of individual violence, civil unrest, and mass atrocities. Countries such as Kenya, whose history includes ethnic or election-related violence, foreign or domestic interference, and a high reliance on social media for political discourse, are considered to be at higher risk.

The United Nations Framework of Analysis for Atrocity Crimes identifies elections as an atrocity risk indicator: disinformation can act as a threat multiplier for atrocity crime. Recognition of the seriousness of this problem is essential to mobilizing governments, civil society, and social media platforms to take steps to prevent both online and offline harm.

== Disinformation channels ==

=== Scientific research ===

Disinformation attacks target the credibility of science, particularly in areas of public health and environmental science.{{cite journal |last1=Goldberg |first1=Rebecca F. |last2=Vandenberg |first2=Laura N. |date=26 March 2021 |title=The science of spin: targeted strategies to manufacture doubt with detrimental effects on environmental and public health |journal=Environmental Health |volume=20 |issue=1 |pages=33 |doi=10.1186/s12940-021-00723-0 |issn=1476-069X |pmc=7996119 |pmid=33771171 |bibcode=2021EnvHe..20...33G |doi-access=free}} Examples include denying the dangers of leaded gasoline,{{cite journal |last1=Rosner |first1=D |last2=Markowitz |first2=G |date=April 1985 |title=A 'gift of God'?: The public health controversy over leaded gasoline during the 1920s. |journal=American Journal of Public Health |language=en |volume=75 |issue=4 |pages=344–352 |doi=10.2105/AJPH.75.4.344 |issn=0090-0036 |pmc=1646253 |pmid=2579591}}{{cite news |last1=Kitman |first1=Jamie Lincoln |date=2 March 2000 |title=The Secret History of Lead |work=The Nation |url=https://www.thenation.com/article/archive/secret-history-lead/ |access-date=17 January 2023}} smoking,{{cite journal |last1=Tan |first1=Andy S. L. |last2=Bigman |first2=Cabral A. |date=October 2020 |title=Misinformation About Commercial Tobacco Products on Social Media—Implications and Research Opportunities for Reducing Tobacco-Related Health Disparities |journal=American Journal of Public Health |volume=110 |issue=S3 |pages=S281–S283 |doi=10.2105/AJPH.2020.305910 |pmc=7532322 |pmid=33001728}}{{cite journal |last1=Brandt |first1=AM |date=January 2012 |title=Inventing conflicts of interest: a history of tobacco industry tactics. |journal=American Journal of Public Health |volume=102 |issue=1 |pages=63–71 |doi=10.2105/AJPH.2011.300292 |pmc=3490543 |pmid=22095331}}{{cite news |last1=Hulac |first1=Benjamin |date=July 20, 2016 |title=Tobacco and Oil Industries Used Same Researchers to Sway Public |language=en |work=Scientific American |url=https://www.scientificamerican.com/article/tobacco-and-oil-industries-used-same-researchers-to-sway-public1/ |access-date=17 January 2023}} and climate change.{{cite news |last1=Pierre |first1=Jeffrey |last2=Neuman |first2=Scott |date=October 27, 2021 |title=How decades of disinformation about fossil fuels halted U.S. climate policy |work=All Things Considered |agency=National Public Radio |url=https://www.npr.org/2021/10/27/1047583610/once-again-the-u-s-has-failed-to-take-sweeping-climate-action-heres-why |access-date=17 January 2023}}{{cite journal |last1=Farrell |first1=Justin |date=18 March 2019 |title=The growth of climate change misinformation in US philanthropy: evidence from natural language processing |journal=Environmental Research Letters |volume=14 |issue=3 |pages=034013 |bibcode=2019ERL....14c4013F |doi=10.1088/1748-9326/aaf939 |s2cid=158732419 |doi-access=free}}

A pattern for disinformation attacks involving scientific sources developed in the 1920s. It illustrates tactics that continue to be used.{{cite news |last1=Alden |first1=Timothy |title=Propaganda in the war over climate change - Timothy Alden |url=https://timesofmalta.com/articles/view/propaganda-in-the-war-over-climate-change-timothy-alden.748596 |access-date=18 January 2023 |work=Times of Malta |date=November 9, 2019 |language=en-gb}}

As early as 1910, industrial toxicologist Alice Hamilton documented the dangers associated with exposure to lead.{{cite book | last1 =Sicherman | first1 =Barbara | first2 =Carol Hurd | last2 =Green | title =Notable American Women: The Modern Period, A Biographical Dictionary | publisher =Belknap Press of Harvard University | year =1980 | location =Cambridge, Massachusetts | pages = 303–306 | url =https://archive.org/details/notableamericanw00sich/page/304 | isbn =9780674627321 }} In the 1920s, Charles Kettering, Thomas Midgley Jr. and Robert A. Kehoe of the Ethyl Gasoline Corporation introduced lead into gasoline. Following widely publicized cases of madness and deaths among workers at their plants, a Public Health Service conference was held in 1925 to review the use of tetraethyllead (TEL). Hamilton and others warned of leaded gasoline's potential danger to people and the environment. They questioned the research methodology used by Kehoe, who claimed that lead was a "natural" part of the environment and that high lead levels in workers were "normal".{{cite journal |last1=Hernberg |first1=Sven |title=Lead Poisoning in a Historical Perspective |journal=American Journal of Industrial Medicine |date=2000 |volume=38 |issue=3 |pages=244–254 |doi=10.1002/1097-0274(200009)38:3<244::AID-AJIM3>3.0.CO;2-F |pmid=10940962 |url=https://www.biologicaldiversity.org/campaigns/get_the_lead_out/pdfs/health/Hernberg_2000.pdf |access-date=18 January 2023}}{{cite journal |last1=Kovarik |first1=William |s2cid=44633845 |title=Ethyl-leaded gasoline: How a classic occupational disease became an international public health disaster |journal=International Journal of Occupational and Environmental Health |date=2005 |volume=11 |issue=4 |pages=384–397 |doi=10.1179/oeh.2005.11.4.384 |pmid=16350473}} Kettering, Midgley and Kehoe emphasized that a gas additive was needed, and argued that until "it is shown ... that an actual danger to the public is had as a result", the company should be allowed to produce its product. Rather than requiring industry to show that their product was safe before it could be sold, the burden of proof was placed on public health advocates to provide incontestable proof that harm had occurred.{{cite book |last1=Moore |first1=Colleen F. |title=Children and Pollution: Children, Pollution, and Why Scientists Disagree |date=8 April 2009 |publisher=Oxford University Press |isbn=978-0-19-045267-4 |pages=3–10 |url=https://books.google.com/books?id=uFTiBwAAQBAJ&pg=PA3 |language=en}}{{cite journal |last1=Rosner |first1=David |last2=Markowitz |first2=Gerald |title=A 'Gift of God'?: The Public Health Controversy over Leaded Gasoline during the 1920s |journal=American Journal of Public Health |date=April 1985 |volume=75 |issue=4 |pages=344–352 |doi=10.2105/ajph.75.4.344 |pmid=2579591 |pmc=1646253}} Critics of TEL were described as "hysterical".{{cite news |last1=Hanna-Attisha |first1=Mona |title=Perspective {{!}} A proposed EPA rule prioritizes industry profit over people's lives |url=https://www.washingtonpost.com/outlook/2019/12/09/proposed-epa-rule-prioritizes-industry-profit-over-peoples-lives/ |access-date=18 January 2023 |newspaper=Washington Post |date=December 9, 2019}} With industry support, Kehoe went on to become a prominent industry expert and advocate for the position that leaded gasoline was safe, holding "an almost complete monopoly" on research in the area.{{cite journal |author=Herbert L. Needleman |author-link=Herbert L.
Needleman |title=Clair Patterson and Robert Kehoe: Two Views on Lead Toxicity| journal=Environmental Research |year=1998 |pages=79–85|pmid=9719611 |doi=10.1006/enrs.1997.3807 |volume=78 |issue=2|bibcode=1998ER.....78...79N }} It would be decades before his work was finally discredited. In 1988, the EPA estimated that over the previous 60 years, 68 million children suffered high toxic exposure to lead from leaded fuels.{{cite web |url=http://www.todayifoundout.com/index.php/2011/11/why-lead-used-to-be-added-to-gasoline/ |title=Why lead used to be added to gasoline |date=15 November 2011 |access-date=2017-12-05 |url-status=live |archive-url=https://web.archive.org/web/20171003081233/http://www.todayifoundout.com/index.php/2011/11/why-lead-used-to-be-added-to-gasoline/ |archive-date=2017-10-03 }} A 2022 review reported that the use of lead in gasoline was linked to neurodevelopmental disabilities in children and to neurobehavioral deficits, cardiovascular and kidney disease, and premature deaths in adults.{{Cite journal |last1=Angrand |first1=Ruth C. |last2=Collins |first2=Geoffrey |last3=Landrigan |first3=Philip J. |last4=Thomas |first4=Valerie M. |date=2022-12-27 |title=Relation of blood lead levels and lead in gasoline: an updated systematic review |journal=Environmental Health |volume=21 |issue=1 |pages=138 |doi=10.1186/s12940-022-00936-x |doi-access=free |issn=1476-069X |pmc=9793664 |pmid=36572887|bibcode=2022EnvHe..21..138A }}

By the 1950s, the production and use of biased "scientific" research were part of a consistent "disinformation playbook", used by companies in the tobacco,{{cite journal |last1=Muggli |first1=Monique E. |last2=Hurt |first2=Richard D. |last3=Blanke |first3=D. Douglas |title=Science for hire: a tobacco industry strategy to influence public opinion on secondhand smoke |journal=Nicotine & Tobacco Research |date=June 2003 |volume=5 |issue=3 |pages=303–314 |doi=10.1080/1462220031000094169 |pmid=12791525 |url=https://pubmed.ncbi.nlm.nih.gov/12791525/ |access-date=18 January 2023 |issn=1462-2203}} pesticide{{cite news |last1=Drugmand |first1=Dana |title=Pesticide Industry 'Helped Write' Disinformation Playbook Used by Big Oil and Big Tobacco, Report Reveals |url=https://www.desmog.com/2022/12/09/pesticide-industry-science-denial-monsanto-glyphosate-cancer/ |work=DeSmog |date=9 December 2022}} and fossil fuel industries.{{cite news |title=Climate disinformation continues to leave a mark as world gets hotter |url=https://www.pbs.org/newshour/world/climate-disinformation-continues-to-leave-a-mark-as-world-gets-hotter |access-date=18 January 2023 |work=PBS NewsHour |date=26 July 2022 |language=en-us}} In many cases, the same researchers, research groups, and public relations firms were hired by multiple industries. They repeatedly argued that products were safe while knowing that they were unsafe. When assertions of safety were challenged, it was argued that the products were necessary. Through coordinated and widespread campaigns, they worked to influence public opinion and to manipulate government officials and regulatory agencies, to prevent regulatory or legal action that might interfere with profits.

Similar tactics continue to be used by scientific disinformation campaigns. When proof of harm is presented, it is argued that the proof is not sufficient. The argument that more proof is needed is used to put off action to some future time. Delays are used to block attempts to limit or regulate industry, and to avoid litigation, while continuing to profit. Industry-funded experts carry out research that all too often can be challenged on methodological grounds as well as over conflicts of interest. Disinformers use bad research as a basis for claiming that scientists are not in agreement, and to generate specific claims as part of a disinformation narrative. Opponents are often attacked on a personal level as well as in terms of their scientific work.{{cite journal |last1=Reed |first1=Genna |last2=Hendlin |first2=Yogi |last3=Desikan |first3=Anita |last4=MacKinney |first4=Taryn |last5=Berman |first5=Emily |last6=Goldman |first6=Gretchen T. |author-link6=Gretchen Goldman |date=1 December 2021 |title=The disinformation playbook: how industry manipulates the science-policy process—and how to restore scientific integrity |journal=Journal of Public Health Policy |language=en |volume=42 |issue=4 |pages=622–634 |doi=10.1057/s41271-021-00318-6 |issn=1745-655X |pmc=8651604 |pmid=34811464}}{{cite news |last1=Readfearn |first1=Graham |title=Doubt over climate science is a product with an industry behind it {{!}} Graham Readfearn |url=https://www.theguardian.com/environment/planet-oz/2015/mar/05/doubt-over-climate-science-is-a-product-with-an-industry-behind-it |access-date=18 January 2023 |work=The Guardian |date=5 March 2015 |language=en}}

A tobacco industry memo summarized this approach by saying "Doubt is our product". Scientists generally consider a question in terms of the likelihood that a conclusion is supported, given the weight of the best available scientific evidence. Evidence tends to involve measurement, and measurement introduces a potential for error. A scientist may say that available evidence is sufficient to support a conclusion about a problem, but will rarely claim that a problem is fully understood or that a conclusion is 100% certain. Disinformation rhetoric tries to undermine science and sway public opinion by using a "doubt strategy". Reframing the normal scientific process, disinformation often suggests that anything less than 100% certainty implies doubt, and that doubt means there is no consensus about an issue. Disinformation attempts to undermine both certainty about a particular issue and about science itself.{{cite journal |author-first=Carl F. |author-last=Cranor |title=Public Health: The Tobacco Strategy Entrenched |journal=Science |volume=321 |issue=5894 |pages=1296–7 |date=5 September 2008 |doi= 10.1126/science.1162339|s2cid=153706560 |url=http://www.sciencemag.org/cgi/content/summary/321/5894/1296|url-access=subscription }}

Decades of disinformation attacks have considerably eroded public trust in science.

Scientific information can become distorted as it is transferred among primary scientific sources, the popular press, and social media. This can occur both intentionally and unintentionally.

Some features of current academic publishing like the use of preprint servers make it easier for inaccurate information to become public, particularly if the information reported is novel or sensational.{{cite journal |last1=West |first1=Jevin D. |last2=Bergstrom |first2=Carl T. |title=Misinformation in and about science |journal=Proceedings of the National Academy of Sciences |date=13 April 2021 |volume=118 |issue=15 |pages=e1912444117 |doi=10.1073/pnas.1912444117 |pmid=33837146 |pmc=8054004 |bibcode=2021PNAS..11812444W |language=en |issn=0027-8424|doi-access=free }}

Steps to protect science from disinformation and interference include both individual actions on the part of scientists, peer reviewers, and editors, and collective actions via research institutions, granting bodies, professional organizations, and regulatory agencies.{{cite journal |last1=Bergstrom |first1=Carl T. |title=Eight rules to combat medical misinformation |journal=Nature Medicine |date=December 2022 |volume=28 |issue=12 |pages=2468 |doi=10.1038/s41591-022-02118-1 |pmid=36513887 |s2cid=254661855 |language=en |issn=1078-8956|doi-access=free }}{{cite journal |last1=Bergstrom |first1=Carl T. |last2=West |first2=Jevin D. |title=How publishers can fight misinformation in and about science and medicine |journal=Nature Medicine |date=7 July 2023 |volume=29 |issue=9 |pages=2174–2176 |doi=10.1038/s41591-023-02411-7 |pmid=37420100 |s2cid=259369061 |url=https://www.nature.com/articles/s41591-023-02411-7.epdf |language=en |issn=1078-8956|url-access=subscription }}

= Traditional media outlets =

Traditional media channels can be used to spread disinformation. For example, Russia Today is a state-funded news channel that is broadcast internationally. It aims to boost Russia's reputation abroad and also depict Western nations, such as the U.S., in a negative light. It has served as a platform to disseminate propaganda and conspiracy theories intended to mislead and misinform its audience.

Within the United States, sharing of disinformation and propaganda has been associated with the development of increasingly "partisan" media, most strongly in right-wing sources such as Breitbart, The Daily Caller, and Fox News.{{cite book|last1=Faris|first1=Robert|title=Partisanship, Propaganda, and Disinformation: Online Media and the 2016 U.S. Presidential Election|last2=Roberts|first2=Hal|last3=Etling|first3=Bruce|date=August 8, 2017|publisher=Berkman Center for Internet & Society|pages=72|ssrn=3019414 |oclc=1048396744|url=https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3019414 |quote="Disinformation and propaganda from dedicated partisan sites on both sides of the political divide played a much greater role in the election. It was more rampant, though, on the right than on the left, as it took root in the dominant partisan media on the right, including Breitbart, The Daily Caller, and Fox News."}} As local news outlets have declined, there has been an increase in partisan media outlets that "masquerade" as local news sources.{{cite book |last1=Ardia |first1=David |last2=Ringel |first2=Evan |last3=Smith Ekstrand |first3=Victoria |last4=Fox |first4=Ashley |title=Addressing the decline of local news, rise of platforms, and spread of mis- and disinformation online A Summary of current research and policy proposals |date=2023 |publisher=UNC Center for Media Law and Policy, University of North Carolina at Chapel Hill |url=https://citap.unc.edu/news/local-news-platforms-mis-disinformation/}}{{Cite book |last1=Blevins |first1=Jeffrey Layne |first2= James Jaehoon |last2=Lee |title=Social Media, Social Justice and the Political Economy of Online Networks |publisher=University of Cincinnati Press |year=2022 |isbn=9781947602847 |location=Cincinnati |language=en}} The impact of partisanship and its amplification through the media is well documented. For example, attitudes to climate legislation were bipartisan in the 1990s but became intensely polarized by 2010. While media messaging on climate from Democrats increased between 1990 and 2015 and tended to support the scientific consensus on climate change, Republican messaging around climate decreased and became more mixed.

A "gateway belief" that affects people's acceptance of scientific positions and policies is their understanding of the extent of scientific agreement on a topic. Undermining scientific consensus is therefore a frequent disinformation tactic. Indicating that there is a scientific consensus (and explaining the science involved) can help to counter misinformation. Indicating the broad consensus of experts can help to align people's perceptions and understandings with the empirical evidence. Presenting messages in a way that aligns with someone's cultural frame of reference makes them more likely to be accepted.

It is important to avoid false balance, in which opposing claims are presented in a way that is out of proportion to the actual evidence for each side. One way to counter false balance is to present a weight-of-evidence statement that explicitly indicates the balance of evidence for different positions.{{cite journal |last1=Imundo |first1=Megan N. |last2=Rapp |first2=David N. |title=When fairness is flawed: Effects of false balance reporting and weight-of-evidence statements on beliefs and perceptions of climate change. |journal=Journal of Applied Research in Memory and Cognition |date=June 2022 |volume=11 |issue=2 |pages=258–271 |doi=10.1016/j.jarmac.2021.10.002 |s2cid=245175824 |url=https://doi.org/10.1016/j.jarmac.2021.10.002 |access-date=15 June 2023 |language=en |issn=2211-369X|url-access=subscription }}{{cite news |last1=Dunwoody |first1=Sharon |title=Weight-of-Evidence Reporting: What Is It? Why Use It? |url=https://niemanreports.org/articles/weight-of-evidence-reporting-what-is-it-why-use-it/ |work=Nieman Reports |date=December 15, 2005}}

= Social media =

Perpetrators primarily use social media channels as a medium to spread disinformation, using a variety of tools.{{cite news |last1=Robins-Early |first1=Nick |title=Disinformation for profit: scammers cash in on conspiracy theories |url=https://www.theguardian.com/media/2022/feb/20/facebook-disinformation-ottawa-social-media |access-date=26 January 2023 |work=The Guardian |date=21 February 2022 |language=en}} Researchers have compiled multiple actions through which disinformation attacks occur on social media, which are summarized in the table below.{{Cite web |title=Disinformation glossary: 150+ Terms to Understand the Information Disorder |url=https://www.disinfo.eu/publications/disinformation-glossary-150-terms-to-understand-the-information-disorder/ |access-date=2023-11-08 |website=EU DisinfoLab |language=en-US}}{{Cite journal |last=Jack |first=Caroline |date=2017-08-09 |title=Lexicon of lies: terms for problematic information |url=https://apo.org.au/node/183786 |journal=Data & Society Research Institute |language=en}}

class="wikitable"

|+

! colspan="2" |Disinformation attack modes on social media.

Term

!Description

Algorithms

|Algorithms are leveraged to amplify the spread of disinformation. Algorithms filter and tailor information for users and modify the content they consume.{{Cite journal |last=Sacasas |first=L. M. |date=2020 |title=The Analog City and the Digital City |url=https://www.jstor.org/stable/26898497 |journal=The New Atlantis |issue=61 |pages=3–18 |doi= |issn=1543-1215 |jstor=26898497}} A study found that algorithms can be radicalization pipelines because they present content based on its user engagement levels. Users are drawn more to radical, shocking, and click-bait content.{{Cite journal |last1=Brogly |first1=Chris |last2=Rubin |first2=Victoria L. |date=2018 |title=Detecting Clickbait: Here's How to Do It / Comment détecter les pièges à clic |url=https://muse.jhu.edu/article/743050 |journal=Canadian Journal of Information and Library Science |language=en |volume=42 |issue=3 |pages=154–175 |issn=1920-7239}} As a result, extremist, attention-grabbing posts can garner high levels of engagement through algorithms. Disinformation campaigns may leverage algorithms to amplify their extremist content and sow radicalization online.{{Cite journal |last=Heldt |first=Amélie |date=2019 |title=Let's Meet Halfway: Sharing New Responsibilities in a Digital Age |journal=Journal of Information Policy |volume=9 |pages=336–369 |doi=10.5325/jinfopoli.9.2019.0336 |issn=2381-5892 |jstor=10.5325/jinfopoli.9.2019.0336 |s2cid=213340236 |doi-access=free}}

Astroturfing

|A centrally coordinated campaign that mimics grassroots activism by making participants pretend to be ordinary citizens. Astroturfing is putting out overwhelming amounts of content promoting similar messages from multiple fake accounts. This gives an impression of widespread consensus around a message, simulating a grassroots response while hiding its origin. Flooding is the spamming of social media with messages to shape a narrative or drown out opposition. Repeated exposure to a message is more likely to establish it in someone's mind. Disinformation actors will often tailor messages to a particular audience, to engage with individuals and build credibility with them, before exposing them to more extreme or misleading views.{{cite web |title=Tactics of Disinformation |url=https://www.cisa.gov/sites/default/files/publications/tactics-of-disinformation_508.pdf |access-date=18 January 2023 |website=Cybersecurity and Infrastructure Security Agency (CISA)}}{{cite book |last1=Verkamp |first1=John-Paul |url=https://www.usenix.org/conference/foci13/workshop-program/presentation/verkamp |title=3rd USENIX Workshop on Free and Open Communications on the Internet (FOCI 13) |last2=Gupta |first2=Minaxi |date=2013 |publisher=USENIX Association |location=Washington, D.C. |language=en |chapter=Five Incidents, One Theme: Twitter Spam as a Weapon to Drown Voices of Protest}}{{cite journal |last1=Piña-García |first1=C. A. |last2=Espinoza |first2=A. |date=31 December 2022 |title=Coordinated campaigns on Twitter during the coronavirus health crisis in Mexico |journal=Tapuya: Latin American Science, Technology and Society |language=en |volume=5 |issue=1 |doi=10.1080/25729861.2022.2035935 |s2cid=248055226 |issn=2572-9861|doi-access=free }}

Bots

|Bots are automated agents that can produce and spread content on online social platforms. Many bots can engage in basic interactions with other bots and humans. In disinformation attack campaigns, they are leveraged to rapidly disseminate disinformation and breach digital social networks. Bots can produce the illusion that one piece of information is coming from a variety of different sources. In doing so, disinformation attack campaigns make their content seem believable through repeated and varied exposure.{{Cite book |last=Kirdemir |first=Baris |date=2019 |title=Hostile Influence and Emerging Cognitive Threats in Cyberspace |url=https://www.jstor.org/stable/resrep21052 |publisher=Centre for Economics and Foreign Policy Studies |jstor=resrep21052}} By flooding social media channels with repeated content, bots can also alter algorithms and shift online attention to disinformation content. Influence operations can spread content that serves as training data for large language models (LLM) in order to influence the output produced by popular chatbots.{{Cite web |last=McCurdy |first=Will |date=2025-03-08 |title=Russian Disinformation 'Infects' Popular AI Chatbots |url=https://www.pcmag.com/news/russian-disinformation-infects-popular-ai-chatbots |access-date=2025-03-09 |website=PCMag |language=en}}{{Cite web |last=Fried |first=Ina |date=2025-03-06 |title=AI chatbots echo Russian disinformation, report warns |url=https://www.axios.com/2025/03/06/exclusive-russian-disinfo-floods-ai-chatbots-study-finds |access-date=2025-03-09 |website=Axios |language=en}}{{Cite web |last=Maxwell |first=Thomas |date=2025-03-07 |title=Russia Is 'Grooming' Global AI Models to Cite Propaganda Sources |url=https://gizmodo.com/russia-is-grooming-global-ai-models-to-cite-propaganda-sources-2000573160 |access-date=2025-03-09 |website=Gizmodo |language=en-US}} This has been termed "LLM grooming."{{Cite web |last=Goudarzi |first=Sara |date=2025-03-26 |title=Russian networks flood the Internet with propaganda, aiming to corrupt AI chatbots |url=https://thebulletin.org/2025/03/russian-networks-flood-the-internet-with-propaganda-aiming-to-corrupt-ai-chatbots/ |access-date=2025-04-10 |website=Bulletin of the Atomic Scientists |language=en-US}}

Clickbait

|The deliberate use of misleading headlines and thumbnails to increase online traffic for profit or popularity

Conspiracy theories

|Rebuttals of official accounts that propose alternative explanations in which individuals or groups act in secret

Culture wars

|A phenomenon in which multiple groups of people, who hold entrenched values, attempt to steer public policy contentiously

Deep fakes

|A deep fake is digital content (audio and video) that has been manipulated. Deep fake technology can be harnessed to defame, blackmail, and impersonate. Due to its low costs and efficiency, deep fakes can be used to spread disinformation more quickly and in greater volume than humans can. Disinformation attack campaigns may leverage deep fake technology to generate disinformation concerning people, states, or narratives. Deep fake technology can be weaponized to mislead an audience and spread falsehoods.{{Cite web |title=Weaponised deep fakes: National security and democracy on JSTOR |url=https://www.jstor.org/stable/resrep25129 |access-date=2020-11-12 |website=www.jstor.org |language=en}}

Echo chambers

|An epistemic environment in which participants encounter beliefs and opinions that coincide with their own

Hoax

|News in which false facts are presented as legitimate

Fake news

|The deliberate creation of pseudo-journalism and the instrumentalization of the term to delegitimize news media

Personas

|Personas and websites may be created with the intention of presenting and spreading incorrect information in a way that makes it appear credible. A faked website may present itself as being from a professional or educational organization. A person may imply that they have credentials or expertise. Disinformation actors may create whole networks of interconnected supposed "authorities". Whether we assume that someone is truthful and whether we choose to fact-check what we see are predictors of susceptibility to disinformation. Carefully consider sources and claims of authority; cross-check information against a wide range of sources.

Propaganda

|Organized mass communication, on a hidden agenda, and with a mission to conform belief and action by circumventing individual reasoning

Pseudoscience

|Accounts that claim the explanatory power of science, borrow its language and legitimacy but diverge substantially from its quality criteria

Rumors

|Unsubstantiated news stories that circulate while not corroborated or validated

Trolling

|Networked groups of digital influencers that operate ‘click armies' designed to mobilize public sentiment
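
The engagement-driven amplification described in the table's "Algorithms" entry can be illustrated with a minimal sketch. The posts, signals, and scoring weights below are invented, and no platform's actual ranking code is implied; the point is only that ranking purely by predicted engagement, with no weight on accuracy, pushes sensational content to the top.

<syntaxhighlight lang="python">
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    clicks: int          # hypothetical engagement signals
    shares: int
    watch_seconds: float

def engagement_score(post: Post) -> float:
    """Toy engagement objective: a weighted sum of interaction signals."""
    return 1.0 * post.clicks + 3.0 * post.shares + 0.01 * post.watch_seconds

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order a feed by predicted engagement alone; accuracy carries no weight."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Measured explainer on vaccine safety", clicks=120, shares=10, watch_seconds=900.0),
    Post("SHOCKING cure THEY don't want you to see", clicks=400, shares=90, watch_seconds=300.0),
])
print([p.text for p in feed])  # the sensational post ranks first
</syntaxhighlight>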

An app called "Dawn of Glad Tidings," developed by Islamic State members, assists in the organization's efforts to rapidly disseminate disinformation in social media channels. When users download the app, they are prompted to link it to their Twitter accounts and to grant it permission to tweet from those accounts. This allows automated tweets to be sent from real user accounts and helps create trends across Twitter that amplify disinformation produced by the Islamic State on an international scale.

In many cases, individuals and companies in different countries are paid to create false content and push disinformation, sometimes earning both payments and advertising revenue by doing so. "Disinfo-for-hire actors" often promote multiple issues, or even multiple sides in the same issue, solely for material gain.{{cite news |last1=Fisher |first1=Max |title=Disinformation for Hire, a Shadow Industry, Is Quietly Booming |url=https://www.nytimes.com/2021/07/25/world/europe/disinformation-social-media.html |access-date=27 January 2023 |work=The New York Times |date=25 July 2021}} Others are motivated politically or psychologically.

[[File:20231030 Digital media business model.svg|thumb|Business models of digital platforms, like YouTube, work in three parts: (1) information and entertainment (infotainment) is provided at low cost, (2) in exchange for user attention and user surveillance data, (3) this information is then monetized through targeted ads.]]

More broadly, the monetization practices of social media and online advertising can be exploited to amplify disinformation.{{Cite journal |last=Diaz Ruiz |first=Carlos A. |date=2024-10-30 |title=Disinformation and fake news as externalities of digital advertising: a close reading of sociotechnical imaginaries in programmatic advertising |journal=Journal of Marketing Management |language=en |pages=1–23 |doi=10.1080/0267257X.2024.2421860 |issn=0267-257X|doi-access=free }} Social media's business model can be used to spread disinformation. Media outlets (1) provide content to the public at little or no cost, (2) capture and refocus public attention, and (3) collect, use and resell user data. Advertising companies, publishers, influencers, brands, and clients may benefit from disinformation in a variety of ways.
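
As a rough illustration of this three-part model, the toy calculation below uses invented numbers for impressions, click-through rate, and ad price; it sketches the incentive only, not any platform's actual pricing. Holding audience size and price fixed, higher-engagement content earns proportionally more advertising revenue regardless of its accuracy.

<syntaxhighlight lang="python">
def ad_revenue(impressions: int, click_rate: float, price_per_click: float) -> float:
    """Attention becomes impressions; impressions are monetized through ads."""
    return impressions * click_rate * price_per_click

# Same audience size and ad price; only engagement differs (numbers invented).
sober = ad_revenue(10_000, click_rate=0.01, price_per_click=0.30)        # 30.0
sensational = ad_revenue(10_000, click_rate=0.05, price_per_click=0.30)  # 150.0
print(sober, sensational)
</syntaxhighlight>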

In 2022, the Journal of Communication published a study of the political economy underlying disinformation around vaccines. Researchers identified 59 English-language "actors" that provided "almost exclusively anti-vaccination publications". Their websites monetized disinformation through appeals for donations, sales of content-based media and other merchandise, third-party advertising, and membership fees. Some maintained a group of linked websites, attracting visitors with one site and appealing for money and selling merchandise on others. In how they gained attention and obtained funding, their activities displayed a "hybrid monetization strategy". They attracted attention by combining eye-catching aspects of "junk news" and online celebrity promotion. At the same time, they developed campaign-specific communities to publicize and legitimize their position, similar to radical social movements.{{cite journal |last1=Herasimenka |first1=Aliaksandr |last2=Au |first2=Yung |last3=George |first3=Anna |last4=Joynes-Burgess |first4=Kate |last5=Knuutila |first5=Aleksi |last6=Bright |first6=Jonathan |last7=Howard |first7=Philip N |title=The political economy of digital profiteering: communication resource mobilization by anti-vaccination actors |journal=Journal of Communication |date=24 December 2022 |volume=73 |issue=2 |pages=126–137 |doi=10.1093/joc/jqac043 |pmid=37016634 |pmc=10066223 |url=https://doi.org/10.1093/joc/jqac043 |access-date=27 January 2023}}

= Social engineering =

Emotion is used and manipulated to spread disinformation and false beliefs. Arousing emotions can be persuasive. When people feel strongly about something, they are more likely to see it as true. Emotion can also cause people to think less clearly about what they are reading and the credibility of its source. Content that appeals to emotion is more likely to spread quickly on the internet. Fear, confusion, and distraction can all interfere with people's ability to think critically and make good decisions.

Human psychology is leveraged to make disinformation attacks more potent and viral. Psychological phenomena, such as stereotyping, confirmation bias, selective attention, and echo chambers, contribute to the virality and success of disinformation on digital platforms.{{Cite journal |last=Buchanan |first=Tom |date=2020-10-07 |editor-last=Zhao |editor-first=Jichang |title=Why do people spread false information online? The effects of message and viewer characteristics on self-reported likelihood of sharing social media disinformation |journal=PLOS ONE |language=en |volume=15 |issue=10 |pages=e0239666 |bibcode=2020PLoSO..1539666B |doi=10.1371/journal.pone.0239666 |issn=1932-6203 |pmc=7541057 |pmid=33027262 |doi-access=free}} Disinformation attacks are often considered a type of psychological warfare because of their use of psychological techniques to manipulate populations.{{Cite journal |last=Thomas |first=Timothy L. |date=2020 |title=Information Weapons: Russia's Nonnuclear Strategic Weapons of Choice |url=https://www.jstor.org/stable/26923527 |journal=The Cyber Defense Review |volume=5 |issue=2 |pages=125–144 |doi= |issn=2474-2120 |jstor=26923527}}

Perceptions of identity and a sense of belonging are manipulated to influence people. Feelings of social belonging are reinforced to encourage affiliation with a group and discourage dissent. This can make people more susceptible to an influencer or leader who may encourage their "engaged followership" to attack others. This type of behavior has been compared to the collective behavior of mobs and is similar to dynamics within cults.{{cite journal |last1=Haslam |first1=S. |last2=Reicher |first2=S. |date=13 October 2017 |title=50 Years of "Obedience to Authority": From Blind Conformity to Engaged Followership |url=https://www.annualreviews.org/doi/abs/10.1146/annurev-lawsocsci-110316-113710 |journal=Annual Review of Law and Social Science |volume=13 |pages=59–78 |doi=10.1146/ANNUREV-LAWSOCSCI-110316-113710|url-access=subscription }}{{cite news |last1=Pazzanese |first1=Christina |date=8 May 2020 |title=Social media used to spread, create COVID-19 falsehoods |work=Harvard Gazette |url=https://news.harvard.edu/gazette/story/2020/05/social-media-used-to-spread-create-covid-19-falsehoods/ |access-date=16 May 2023}}

Defense measures

As has been noted by the Knight First Amendment Institute at Columbia University, "The misinformation problem is social and not just technological or legal." It raises serious ethical issues about how we engage with each other. The 2023 Summit on "Truth, Trust, and Hope", held by the Nobel Committee and the US National Academy of Sciences, identified disinformation as more dangerous than any other crisis because it hampers the addressing and resolution of all other problems.{{cite news |last1=Pruett |first1=Dave |title=Targeting Disinformation At The Nobel Summit |url=https://www.dnronline.com/opinion/targeting-disinformation-at-the-nobel-summit/article_92fccb6c-3fd8-57f6-85fc-eed3e33b9772.html |access-date=9 January 2024 |work=Daily News-Record |date=23 June 2023 |language=en}}

Defensive measures against disinformation can occur at a wide variety of levels, in diverse societies, under different laws and conditions. Responses to disinformation can involve institutions, individuals, and technologies, including government regulation, self-regulation, monitoring by third parties, the actions of private actors, the influence of crowds, and technological changes to platform architecture and algorithmic behaviors.{{cite news |last1=Schmitt |first1=Carolyn |title=Three new ideas for mitigating disinformation |url=https://medium.com/berkman-klein-center/three-new-ideas-for-mitigating-disinformation-9d823738a935 |work=Berkman Klein Center Collection |date=Mar 31, 2020 |language=en}} Advanced systems involving blockchain technologies, crowd wisdom, and artificial intelligence have been developed to fight online disinformation. It is also important to develop and share best practices for countering disinformation and building resilience against it.

Existing social, legal and regulatory guidelines may not apply easily to actions in an international virtual world, where private corporations compete for profitability, often on the basis of user engagement.{{cite journal |last1=Shapiro |first1=Susan P. |title=To Tell the Truth, the Whole Truth, and Nothing but the Truth: Truth Seeking and Truth Telling in Law (and Other Arenas) |journal=Annual Review of Law and Social Science |date=18 October 2022 |volume=18 |issue=1 |pages=61–79 |doi=10.1146/annurev-lawsocsci-050520-100547 |s2cid=248258774 |language=en |issn=1550-3585|doi-access=free }}{{cite journal |last1=Diaz Ruiz |first1=Carlos |date=30 October 2023 |title=Disinformation on digital media platforms: A market-shaping approach |journal=New Media & Society |volume=27 |issue=4 |pages=2188–2211 |language=en |doi=10.1177/14614448231207644 |s2cid=264816011 |issn=1461-4448|doi-access=free }} Ethical concerns apply to some of the possible responses to disinformation, as people debate issues of content moderation, free speech, the right to personal privacy, human identity, human dignity, suppression of human rights and religious freedom, and the use of data.{{cite web |last1=Thacker |first1=Jason |title=4 Ethical Issues Including Fake News, Misinformation, Conspiracy Theories, and Hate Speech |url=https://churchgrowthmagazine.com/4-ethical-issues-including-fake-news-misinformation-conspiracy-theories-and-hate-speech/ |website=Church Growth Magazine |access-date=26 January 2023|date=2022}} The scope of the problem means that "Building resilience to and countering manipulative information campaigns is a whole-of-society endeavor."

= National and international laws =

While authoritarian regimes have chosen to use disinformation attacks as a policy tool, their use poses specific dangers for democratic governments: using equivalent tactics will further deepen public distrust of political processes and undermine the basis of democratic and legitimate government. "Democracies should not seek to covertly influence public debate either by deliberately spreading information that is false or misleading or by engaging in deceptive practices, such as the use of fictitious online personas."

Further, democracies are encouraged to play to their strengths, including rule of law, respect for human rights, cooperation with partners and allies, soft power, and technical capability to address cyber threats.{{cite news |last1=Brandt |first1=Jessica |title=How Democracies Can Win an Information Contest Without Undercutting Their Values |url=https://carnegieendowment.org/posts/2021/08/how-democracies-can-win-an-information-contest-without-undercutting-their-values?lang=en |access-date=24 January 2023 |work=Carnegie Endowment for International Peace |date=August 2, 2021 |language=en}}

The constitutional norms that govern a society are needed both to make governance effective and to avert tyranny. Providing accurate information and countering disinformation are legitimate activities of government. The OECD suggests that public communication of policy responses should follow open government principles of integrity, transparency, accountability and citizen participation.

A discussion of the US government's ability to legally respond to disinformation argues that responses should be based on principles of transparency and generality. Responses should avoid ad hominem attacks, racial appeals, and selectivity about whom to respond to. Criticism should focus first on providing correct information and secondarily on explaining why the false information is wrong, rather than focusing on the speaker or repeating the false narrative.

In the case of the COVID-19 pandemic, multiple factors created "space for misinformation to proliferate". Government responses to this public health issue indicate several areas of weakness including gaps in basic public health knowledge, lack of coordination in government communication, and confusion about how to address a situation involving significant uncertainty. Lessons from the pandemic include the need to admit uncertainty when it exists, and to distinguish clearly between what is known and what is not yet known. Science is a process, and it is important to recognize and communicate that scientific understanding and related advice will change over time on the basis of new evidence.{{cite news |title=Transparency, communication and trust: The role of public communication in responding to the wave of disinformation about the new Coronavirus |url=https://www.oecd.org/coronavirus/policy-responses/transparency-communication-and-trust-the-role-of-public-communication-in-responding-to-the-wave-of-disinformation-about-the-new-coronavirus-bef7ad6e/ |access-date=25 January 2023 |work=OECD Policy Responses to Coronavirus (COVID-19) |agency=Organisation for Economic Co-operation and Development |date=3 July 2020 |language=en}}

Regulation of disinformation raises ethical issues. The right to freedom of expression is recognized as a human right in the Universal Declaration of Human Rights and international human rights law by the United Nations. Many countries have constitutional law that protects free speech. A country's laws may identify specific categories of speech that are or are not protected, and specific parties whose actions are restricted.

== United States ==

The First Amendment to the United States Constitution protects both freedom of speech and freedom of the press from interference by the United States Congress. As a result, the regulation of disinformation in the United States tends to be left to private rather than government action.

The First Amendment does not protect speech used to incite violence or break the law,{{cite web |last1=Smith |first1=Elizabeth |last2=Zelman |first2=Johanna |title=The First Amendment: Where it is Implicated, and Where it is Not |url=https://www.jdsupra.com/legalnews/the-first-amendment-where-it-is-3482126/ |website=JD Supra |access-date=24 January 2023 |language=en}} or "obscenity, child pornography, defamatory speech, false advertising, true threats, and fighting words".

With these exceptions, debating matters of "public or general interest" in a way that is "uninhibited, robust and wide-open" is expected to benefit a democratic society.

The First Amendment tends to rely on counterspeech as a workable corrective measure, preferring refutation of falsehood to regulation.{{cite web |last1=Greene |first1=Jamal |title=Government Counterspeech |url=https://knightcolumbia.org/content/government-counterspeech |website=Knight First Amendment Institute, Columbia University|date=December 16, 2022 |access-date=24 January 2023 |language=en}}

There is an underlying assumption that identifiable parties will have the opportunity to share their views on a relatively level playing field, where a public figure being drawn into a debate will have increased access to the media and a chance of rebuttal.{{Cite journal |author-first1=Thomas E. |author-last1=Kadri |author-first2=Kate |author-last2=Klonick |url=https://southerncalifornialawreview.com/2019/11/01/facebook-v-sullivan-public-figures-and-newsworthiness-in-online-speech-article-by-thomas-e-kadri-kate-klonick/|title=Facebook v. Sullivan: Public Figures and Newsworthiness in Online Speech |date=November 1, 2019 |journal=Southern California Law Review |volume= 93 |issue= 1 |language=en-US |access-date=2020-04-26}} This may no longer hold true when rapid, massive disinformation attacks are deployed against an individual or group through anonymous or multiple third parties, where "A half-day's delay is a lifetime for an online lie."

Other civil and criminal laws are intended to protect individuals and organizations in cases where speech involves defamation of character (libel or slander) or fraud. In such cases, being incorrect is not sufficient to justify legal or governmental action. Incorrect information must demonstrably cause harm to others or enable the liar to gain an unjustified benefit. Someone who has knowingly spread disinformation and used that disinformation to gain money may be chargeable with fraud.{{cite news |last1=Ross |first1=Catherine J. |title=A Right to Lie? New Book Explores Complex Constitutional Questions |url=https://www.law.gwu.edu/right-lie-new-book-explores-complex-constitutional-questions |access-date=25 January 2023 |work=www.law.gwu.edu |date=2021 |language=en}}

The extent to which these existing laws can be effectively applied against disinformation attacks is unclear.{{cite web |last1=Brannon |first1=Valerie C. |title=False Speech and the First Amendment: Constitutional Limits on Regulating Misinformation |url=https://crsreports.congress.gov/product/pdf/IF/IF12180 |website=Congressional Research Service | date=August 1, 2022 |access-date=25 January 2023}} Under one proposed approach, a subset of disinformation that is not only untrue but "communicated for the purpose of gaining profit or advantage by deceit and causes harm as a result" could be considered "fraud on the public" and no longer a type of protected speech. Much of the speech that constitutes disinformation would not meet this test.

== European Union ==

The Digital Services Act (DSA) is a Regulation in EU law that establishes a legal framework within the European Union for the management of content on intermediaries, including illegal content, transparent advertising, and disinformation.{{Cite web|last=Espinoza|first=Javier|title=Internal Google document reveals campaign against EU lawmakers|url=https://www.ft.com/content/d9d05b1e-45c0-44b8-a1ba-3aa6d0561bed |url-access=subscription |date=28 October 2020|website=Financial Times}}

The European Parliament approved the DSA along with the Digital Markets Act on 5 July 2022.{{Cite web|date=2022-07-05|title=Digital Services: landmark rules adopted for a safer, open online environment|access-date=2022-07-05|language=en|url=https://www.europarl.europa.eu/news/en/press-room/20220701IPR34364/digital-services-landmark-rules-adopted-for-a-safer-open-online-environment |website=European Parliament }} The European Council gave its final approval to the Regulation on a Digital Services Act on 4 October 2022.{{cite web | url=https://www.lexology.com/library/detail.aspx?g=297caa4e-686c-452c-a527-c56fb3fba7ee | title=The EU Digital Services Act - Europe's New Regime for Content Moderation | date=4 October 2022 |website=Lexology |agency= Morrison & Foerster LLP |first1=Christoph |last1=Nüßing |first2=Theresa |last2=Oehm |first3=Dominik |last3=Arncken |first4=Andreas |last4=Grünwald |url-status=live |archive-url=https://web.archive.org/web/20230609155833/https://www.lexology.com/library/detail.aspx?g=297caa4e-686c-452c-a527-c56fb3fba7ee |archive-date= Jun 9, 2023 }} It was published in the Official Journal of the European Union on 19 October 2022. Affected service providers will have until 1 January 2024 to comply with its provisions. The DSA aims to harmonise differing laws at the national level in the European Union{{Cite web|last=Stolton|first=Samuel|date=2020-08-18|title=Digital agenda: Autumn/Winter Policy Briefing|url=https://www.euractiv.com/section/digital/news/digital-agenda-autumn-winter-policy-briefing/|access-date=2020-09-02|website=Euractiv |language=en-GB}} including Germany (NetzDG), Austria ("Kommunikationsplattformen-Gesetz") and France ("Loi Avia").{{Cite web |title=Primacy of EU law |url=https://eur-lex.europa.eu/EN/legal-content/glossary/primacy-of-eu-law.html |access-date=2022-04-10 |website=EUR-Lex |language=en |url-status=dead |archive-url=https://web.archive.org/web/20220602155629/https://eur-lex.europa.eu/EN/legal-content/glossary/primacy-of-eu-law.html |archive-date= Jun 2, 2022 }} Platforms with more than 45 million users in the European Union, including Facebook, YouTube, Twitter and TikTok, would be subject to the new obligations. Companies failing to meet those obligations could risk fines of up to 10% of their annual turnover.{{cite magazine |url= https://time.com/5921760/europe-digital-services-act-big-tech/| title= How the E.U's Sweeping New Regulations Against Big Tech Could Have an Impact Beyond Europe | last= Perrigo | first=Billy |magazine=Time | date=December 15, 2020 |access-date=December 29, 2020}}

As of April 25, 2023, Wikipedia was one of 17 platforms to be designated a Very Large Online Platform (VLOP) by the EU Commission, with regulations taking effect as of August 25, 2023.{{cite news |last1=Brodkin |first1=Jon |title=EU names 19 large tech platforms that must follow Europe's new Internet rules |url=https://arstechnica.com/tech-policy/2023/04/google-runs-5-of-the-19-platforms-that-must-follow-eus-new-internet-rules/ |access-date=10 May 2023 |work=Ars Technica |date=25 April 2023 |language=en-us}} In addition to any steps taken by the Wikimedia Foundation, Wikipedia's compliance with the Digital Services Act will be independently audited, on a yearly basis, beginning in 2024.{{cite news |last1=Bradley-Schmieg |first1=Phil |title=Wikipedia is now a Very Large Online Platform (VLOP) under new European Union rules: Here's what that means for Wikimedians and readers |url=https://diff.wikimedia.org/2023/05/04/wikipedia-is-now-a-very-large-online-platform-vlop-under-new-european-union-rules-heres-what-that-means-for-wikimedians-and-readers/ |access-date=10 May 2023 |work=Diff |date=4 May 2023}}

==Russia and China==

It has been suggested that China and Russia are jointly portraying the United States and the European Union in an adversarial way in terms of the use of information and technology. This narrative is then used by China and Russia to justify the restriction of freedom of expression, access to independent media, and internet freedoms. They have jointly called for the "internationalization of internet governance", meaning distribution of control of the internet to individual sovereign states. In contrast, calls for global internet governance emphasize the existence of a free and open internet, whose governance involves citizens and civil society.{{cite news |last1=Bandurski |first1=David |title=China and Russia are joining forces to spread disinformation |url=https://www.brookings.edu/techstream/china-and-russia-are-joining-forces-to-spread-disinformation/ |access-date=26 January 2023 |work=Brookings |date=11 March 2022}} Democratic governments need to be aware of the potential impact of measures used to restrict disinformation both at home and abroad. This is not an argument that should block legislation, but it should be taken into consideration when forming legislation.

= Private regulation =

In the United States, the First Amendment limits the actions of Congress, not those of private individuals, companies and employers. Private entities can establish their own rules (subject to local and international laws) for dealing with information.{{cite web |title=First Amendment and Censorship |url=https://www.ala.org/advocacy/intfreedom/censorship |website=Advocacy, Legislation & Issues |access-date=25 January 2023 |language=en |date=13 June 2008}} Social media platforms like Facebook, Twitter and Telegram could legally establish guidelines for moderation of information and disinformation on their platforms. Ideally, platforms should attempt to balance free expression by their users against the moderation or removal of harmful and illegal speech.{{cite news |last1=Fellmeth |first1=Robert C. |title=Social media must balance 'right of free speech' with audience 'right to know' |url=https://thehill.com/opinion/technology/3814620-social-media-must-balance-right-of-free-speech-with-audience-right-to-know/ |access-date=24 January 2023 |work=The Hill |date=20 January 2023}}{{cite news |last1=Nossel |first1=Suzanne |author-link1=Suzanne Nossel |title=Social Media, Free Speech, and the Scourge of Misinformation |url=https://www.aft.org/hc/fall2020/nossel |access-date=24 January 2023 |work=American Federation of Teachers |date=29 April 2021 |language=en}}

Sharing of information through broadcast media and newspapers has been largely self-regulating. It has relied on voluntary self-governance and standard-setting by professional organizations such as the US Society of Professional Journalists (SPJ). The SPJ has a code of ethics for professional accountability, which includes seeking and reporting truth, minimizing harm, accountability and transparency. The code states that "whoever enjoys a special measure of freedom, like a professional journalist, has an obligation to society to use their freedoms and powers responsibly."{{Cite book|last1=Straubhaar|first1=Joseph|title=Media now: Understanding media, culture, and technology|last2=LaRose|first2=Robert|last3=Davenport|first3=Lucinda|publisher=Cengage Learning|year=2010|isbn=978-1-305-08035-5|location=Boston, MA|pages=477–479|language=English}} Anyone can write a letter to the editor of the New York Times, but the Times will not publish that letter unless they choose to do so.{{cite web |title=The New York Times on the Web: Help Center |url=https://archive.nytimes.com/www.nytimes.com/info/help/letters.html |website=The New York Times |access-date=26 January 2023}}

Arguably, social media platforms are treated more like the post office—which passes along information without reviewing it—than they are like journalists and print publishers who make editorial decisions and are expected to take responsibility for what they publish. The kinds of ethical, social and legal frameworks that journalism and print publishing have developed have not been applied to social media platforms.{{cite news |last1=White |first1=Aidan |title=Ethics in the News - Fake News and Facts in the Post-Truth Era |url=https://ethicaljournalismnetwork.org/fake-news |access-date=26 January 2023 |work=Ethical Journalism Network |date=18 December 2016 |language=en}}

It has been pointed out that social media platforms like Facebook and Twitter lack incentives to control disinformation or to self-regulate.{{cite news |last1=Bauder |first1=David |last2=Liedtke |first2=Michael |title=Whistleblower says Facebook routinely chose 'profit over safety' when it came to misinformation |url=https://fortune.com/2021/10/04/facebook-whistleblower-social-media-misinformation-hate-algorithm/ |access-date=26 January 2023 |work=Fortune |date=October 4, 2021}} To the extent that platforms rely on advertising for revenue, it is to their financial benefit to maximize user engagement, and the attention of users is demonstrably captured by sensational content.{{cite news |title=Doomscrolling and negativity bias: The way we consume news may be detrimental to our health |url=https://whatsnewinpublishing.com/doomscrolling-and-negativity-bias-the-way-we-consume-news-may-be-detrimental-to-our-health/ |access-date=27 January 2023 |work=What's New in Publishing {{!}} Digital Publishing News |date=26 October 2020}} Algorithms that push content based on user search histories, frequent clicks and paid advertising lead to unbalanced, poorly sourced, and actively misleading information. Pushing such content is also highly profitable.{{cite news |last1=Sutcliffe |first1=Chris |title='Disinformation is a business': media execs explore how to demonetize falsehoods |url=https://www.thedrum.com/news/2021/10/01/disinformation-business-media-execs-explore-how-demonetize-falsehoods |access-date=26 January 2023 |work=The Drum |date=October 1, 2021}} When countering disinformation, the use of algorithms for monitoring content is cheaper than employing people to review and fact-check content. People are more effective at detecting disinformation, but they may also bring their own biases (or their employer's biases) to the task of moderation.

Privately owned social media platforms such as Facebook and Twitter can legally develop regulations, procedures and tools to identify and combat disinformation on their platforms.{{cite web |last1=Brannon |first1=Valerie C. |title=Free Speech and the Regulation of Social Media Content |website=Congressional Research Service |date=March 27, 2019 |url=https://sgp.fas.org/crs/misc/R45650.pdf |access-date=24 January 2023}}

For example, Twitter can use machine learning applications to flag content that does not comply with its terms of service and identify extremist posts encouraging terrorism. Facebook and Google have developed a content hierarchy system where fact-checkers can identify and de-rank possible disinformation and adjust algorithms accordingly. Companies are also considering procedural, legal-style systems to regulate content on their platforms. Specifically, they are considering appellate systems: posts may be taken down for violating terms of service and posing a disinformation threat, but users can contest this action through a hierarchy of appellate bodies.
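
A minimal sketch of this kind of machine-learning flagging is shown below, using a generic text classifier rather than any platform's actual system. The training examples, labels, and decision threshold are invented for illustration; production systems train on large moderated corpora and typically route flags to human reviewers rather than acting automatically.

<syntaxhighlight lang="python">
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training examples; real systems learn from large moderated corpora.
train_texts = [
    "miracle cure suppressed by doctors",            # 1 = flag for review
    "join the uprising and attack the unbelievers",  # 1 = flag for review
    "city council approves new bike lanes",          # 0 = leave alone
    "local team wins championship game",             # 0 = leave alone
]
train_labels = [1, 1, 0, 0]

flagger = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
flagger.fit(train_texts, train_labels)

for post in ["the secret cure they are hiding", "bike lanes open next week"]:
    prob = flagger.predict_proba([post])[0][1]  # probability of the "flag" class
    if prob > 0.5:                              # threshold is an assumed tuning choice
        print(f"route to human review: {post!r} ({prob:.2f})")
</syntaxhighlight>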

Blockchain technology has been suggested as a potential defense mechanism against internet manipulation.{{cite journal |last1=Dhall |first1=Sakshi |last2=Dwivedi |first2=Ashutosh Dhar |last3=Pal |first3=Saibal K. |last4=Srivastava |first4=Gautam |title=Blockchain-based Framework for Reducing Fake or Vicious News Spread on Social Media/Messaging Platforms |journal=ACM Transactions on Asian and Low-Resource Language Information Processing |date=1 November 2021 |volume=22 |issue=1 |pages=8:1–8:33 |doi=10.1145/3467019 |s2cid=240462042 |url=https://dl.acm.org/doi/10.1145/3467019 |access-date=24 January 2023 |issn=2375-4699|url-access=subscription }}

While blockchain was originally developed to create a ledger of transactions for the digital currency bitcoin, it is now widely used in applications where a permanent record or history of assets, transactions, and activities is desired. It provides a potential for transparency and accountability.{{cite news |last1=Chavez-Dreyfuss |first1=Gertrude |title=Ukraine launches big blockchain deal with tech firm Bitfury |url=https://www.reuters.com/article/us-ukraine-bitfury-blockchain-idUSKBN17F0N2 |access-date=24 January 2023 |work=Reuters |date=17 April 2017 |language=en}}

Blockchain technology could be applied to make data transport more secure in online spaces and the Internet of Things networks, making it difficult for actors to alter or censor content and carry out disinformation attacks.{{Cite journal|last=Sultan|first=Oz|date=2019|title=Tackling Disinformation, Online Terrorism, and Cyber Risks into the 2020s|url=https://www.jstor.org/stable/26623066|journal=The Cyber Defense Review|volume=4|issue=1|pages=43–60|jstor=26623066|issn=2474-2120}}

Applying techniques such as blockchain and keyed watermarking on social media/messaging platforms could also help to detect and curb disinformation attacks. The density and rate of forwarding of a message could be observed to detect patterns of activity that suggest the use of bots and fake account activity in disinformation attacks. Blockchain could support both backtracking and forward tracking of events that involve the spreading of disinformation. If the content is deemed dangerous or inappropriate, its spread could be curbed immediately.
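
The two mechanisms described above can be sketched as follows: a hash-chained log makes a message's forwarding history tamper-evident, and a simple sliding-window rate check flags forwarding bursts consistent with bot amplification. The record fields, window size, and threshold below are illustrative assumptions, not a deployed protocol.

<syntaxhighlight lang="python">
import hashlib
import json
import time

def chain_record(prev_hash: str, record: dict) -> dict:
    """Append a forwarding event; each entry commits to the previous one's hash."""
    body = json.dumps({"prev": prev_hash, **record}, sort_keys=True)
    return {"hash": hashlib.sha256(body.encode()).hexdigest(), "prev": prev_hash, **record}

def burst_suspicious(timestamps: list[float], window: float = 60.0, limit: int = 50) -> bool:
    """Flag if more than `limit` forwards fall inside any `window`-second span."""
    ts = sorted(timestamps)
    lo = 0
    for hi, t in enumerate(ts):
        while t - ts[lo] > window:
            lo += 1
        if hi - lo + 1 > limit:
            return True
    return False

# Usage: log 120 forwards of one message in about 30 seconds, then test.
log, prev = [], "genesis"
start = time.time()
for i in range(120):
    entry = chain_record(prev, {"msg_id": "m1", "by": f"acct{i}", "at": start + i * 0.25})
    log.append(entry)
    prev = entry["hash"]
print(burst_suspicious([e["at"] for e in log]))  # True: a pattern worth review
</syntaxhighlight>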

Understandably, methods for countering disinformation that involve algorithmic governance raise ethical concerns. The use of technologies that track and manipulate information raises questions about "who is accountable for their operation, whether they can create injustices and erode civic norms, and how we should resolve their (un)intended consequences".{{cite journal |last1=Gritsenko |first1=Daria |last2=Wood |first2=Matthew |title=Algorithmic governance: A modes of governance approach |journal=Regulation & Governance |date=January 2022 |volume=16 |issue=1 |pages=45–62 |doi=10.1111/rego.12367 |s2cid=228833298 |url=https://doi.org/10.1111/rego.12367 |access-date=24 January 2023 |language=en |issn=1748-5983|hdl=10138/356017 |hdl-access=free }}{{cite journal |last1=Yeung |first1=Karen |title=Algorithmic regulation: A critical interrogation: Algorithmic Regulation |journal=Regulation & Governance |date=December 2018 |volume=12 |issue=4 |pages=505–523 |doi=10.1111/rego.12158 |s2cid=157086008 |language=en|doi-access=free }}{{cite book |last1=Pasquale |first1=Frank |title=Black box society: the secret algorithms that control money and information |date=2016 |publisher=Harvard University Press |location=Cambridge, Massachusetts |isbn=9780674970847 |edition=First Harvard University Press paperback}}

A study from the Pew Research Center reports that public support for restriction of disinformation by both technology companies and government increased among Americans from 2018 to 2021. However, views on whether government and technology companies should take such steps became increasingly partisan and polarized during the same time period.{{cite news |last1=Mitchell |first1=Amy |last2=Walker |first2=Mason |title=More Americans now say government should take steps to restrict false information online than in 2018 |url=https://www.pewresearch.org/fact-tank/2021/08/18/more-americans-now-say-government-should-take-steps-to-restrict-false-information-online-than-in-2018/ |access-date=25 January 2023 |work=Pew Research Center |date=August 18, 2021}}

= Collaborative measures =

Cybersecurity experts argue that collaboration between the public and private sectors is necessary to successfully combat disinformation attacks. Recommended cooperative defense strategies include:

  • The creation of "disinformation detection consortiums" in which stakeholders (e.g., private social media companies and governments) convene to discuss disinformation attacks and develop mutual defense strategies.
  • Sharing critical information between private social media companies and the government, so that more effective defense strategies can be developed.{{Cite journal|last=White|first=Adam J.|date=2018|title=Google.gov: Could an alliance between Google and government to filter facts be the looming progressive answer to "fake news"?|url=https://www.jstor.org/stable/26487781|journal=The New Atlantis|issue=55|pages=3–34|jstor=26487781|issn=1543-1215}}
  • Coordination among governments to create a unified and effective response against transnational disinformation campaigns.
  • Combining human intelligence (crowd wisdom) and artificial intelligence in a single system to better analyze, detect, and expose disinformation; a minimal sketch of one such combination follows this list.
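
One way such a combined system could work is a weighted blend of a machine classifier's score and accumulated crowd ratings. The function name, rating scale, and weighting below are illustrative assumptions, not a description of any deployed system.

<syntaxhighlight lang="python">
def combined_credibility(model_score, crowd_ratings, model_weight=0.6):
    """Blend a classifier's probability that an item is disinformation
    (model_score in [0, 1]) with the mean of crowd ratings on the same
    scale (1 = disinformation). The 60/40 weighting is an illustrative
    assumption; a real system would calibrate it against ground truth.
    """
    if not crowd_ratings:
        return model_score  # no human signal yet; fall back to the model
    crowd_score = sum(crowd_ratings) / len(crowd_ratings)
    return model_weight * model_score + (1 - model_weight) * crowd_score

# Example: the model is fairly confident (0.9); three reviewers broadly agree.
print(combined_credibility(0.9, [0.8, 0.7, 1.0]))  # ~0.873
</syntaxhighlight>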

In the United States, however, the Republican Party has actively opposed both disinformation research and government involvement in fighting disinformation. Since Republicans gained a majority in the House of Representatives in January 2023, the House Judiciary Committee has sent letters, subpoenas, and threats of legal action to researchers and even student interns, demanding notes, emails, and other records dating back to 2015. Institutions affected include the Stanford Internet Observatory at Stanford University, the University of Washington, the Atlantic Council's Digital Forensic Research Lab, and the social media analytics firm Graphika. Targeted projects include the Election Integrity Partnership, formed to identify attempts "to suppress voting, reduce participation, confuse voters or delegitimize election results without evidence"{{cite news |last1=Myers |first1=Steven Lee |last2=Frenkel |first2=Sheera |title=G.O.P. Targets Researchers Who Study Disinformation Ahead of 2024 Election |url=https://www.nytimes.com/2023/06/19/technology/gop-disinformation-researchers-2024-election.html |work=The New York Times |date=19 June 2023}} and the Virality Project, which has examined the spread of false claims about vaccines. Researchers argue that they have academic freedom to study social media and disinformation, as well as freedom of speech to report their results.{{cite news |last1=Starks |first1=Tim |title=Analysis {{!}} GOP legal attacks create a chilling effect on misinformation research |url=https://www.washingtonpost.com/politics/2023/09/25/gop-legal-attacks-create-chilling-effect-misinformation-research/ |newspaper=The Washington Post |date=25 September 2023}}{{cite news |last1=Nix |first1=Naomi |last2=Zakrzewski |first2=Cat |last3=Menn |first3=Joseph |date=23 September 2023 |title=Misinformation research is buckling under GOP legal attacks |url=https://www.washingtonpost.com/technology/2023/09/23/online-misinformation-jim-jordan/ |newspaper=The Washington Post |language=en}} Despite conservative claims that the government acted to censor speech online, "no evidence has emerged that government officials coerced the companies to take action against accounts".

At the state level, governments politically aligned with anti-vaccine activists successfully sought a preliminary injunction to prevent the Biden Administration from urging social media companies to fight misinformation about public health. The order, issued by the United States Court of Appeals for the Fifth Circuit in 2023, "severely limits the ability of the White House, the surgeon general, [and] the Centers for Disease Control and Prevention... to communicate with social media companies about content related to COVID-19... that the government views as misinformation".{{Cite web|url=https://www.cnn.com/2023/10/03/tech/cybersecurity-and-infrastructure-security-agency-social-media-lawsuit-injunction/index.html|title=Federal appeals court extends limits on Biden administration communications with social media companies to top US cybersecurity agency|first=Devan|last=Cole|date=October 3, 2023|website=CNN}}

= Strengthening civil society =

Reports on disinformation in Armenia and Asia identify key issues and make recommendations that can be applied to many other countries, particularly those experiencing "both profound disruption and an opportunity for change". These reports emphasize the importance of strengthening civil society by protecting the integrity of elections and rebuilding trust in public institutions. Steps to support the integrity of elections include ensuring a free and fair process, allowing independent observation and monitoring, allowing independent journalistic access, and investigating electoral infractions. Other suggestions include rethinking state communication strategies so that all levels of government can communicate more effectively and address disinformation attacks.

A national dialogue that brings together diverse public, community, political, state, and nonstate stakeholders is recommended for effective long-term strategic planning, as is a unified legislative strategy for dealing with information spaces. Balancing concerns about freedom of expression with protections for individuals and democratic institutions is critical.

Another concern is the development of a healthy information environment that supports fact-based journalism, truthful discourse, and independent reporting at the same time that it rejects information manipulation and disinformation. Key issues for the support of resilient independent media include transparency of ownership, financial viability, editorial independence, media ethics and professional standards, and mechanisms for self-regulation.{{cite news |last1=Zach |first1=Elizabeth |title=In Armenia, factchecking disinformation on politics and war |url=https://akademie.dw.com/en/in-armenia-factchecking-disinformation-on-politics-and-war/a-61757306 |access-date=21 January 2023 |work=Deutsche Welle |date=May 12, 2022}}{{cite news |last1=Scott |first1=Mark |title=Fringe platforms sidestep Europe's disinformation playbook |url=https://www.politico.eu/article/european-commission-disinformation-code/ |access-date=21 January 2023 |work=POLITICO |date=14 June 2022}}{{cite news |title=The 2022 Code of Practice on Disinformation |url=https://digital-strategy.ec.europa.eu/en/policies/code-practice-disinformation |access-date=21 January 2023 |work=The 2022 Code of Practice on Disinformation {{!}} Shaping Europe's digital future |agency=European Commission |date=2022 |language=en}}

During the 2018 Mexican general election, the collaborative journalism project Verificado 2018 was established to address misinformation. It involved at least eighty organizations, including local and national media outlets, universities and civil society and advocacy groups. The group researched online claims and political statements and published joint verifications. During the course of the election, they produced over 400 notes and 50 videos documenting false claims and suspect sites, and tracked instances where fake news went viral.{{cite news |last1=Armstrong |first1=Mia |title=Mexico's Chapter in the Saga of Election Disinformation |url=https://slate.com/technology/2018/08/mexicos-presidential-election-was-rife-with-disinformation-from-inside-the-country.html |work=Slate |date=2 August 2018}} Verificado.mx received 5.4 million visits during the election, with its partner organizations registering millions more.{{cite book |last1=Bandeira |first1=Luiza |last2=Barojan |first2=Donara |last3=Braga |first3=Roberta |last4=Peñarredonda |first4=Jose Luis |last5=Argüello |first5=Maria Fernanda Pérez |title=Disinformation in Democracies: Strengthening Digital Resilience in Latin America |date=2019 |publisher=Atlantic Council |location=Washington, D.C. |isbn=978-1-61977-524-4 |pages=20–29 |url=https://www.atlanticcouncil.org/in-depth-research-reports/report/disinformation-democracies-strengthening-digital-resilience-latin-america/ }}{{rp|25}} To deal with the sharing of encrypted messages via WhatsApp, Verificado set up a hotline where WhatsApp users could submit messages for verification and debunking. Over 10,000 users subscribed to Verificado's hotline.

{{external media | width = 210px | float = right | headerimage= | video1 = [https://www.youtube.com/watch?v=5MjhkFO1sAU&ab_channel=PENAmerica "Physical Safety Strategies for Reporters"], PEN America, Jun 12, 2020.}}

Organizations promoting civil society and democracy, independent journalists, human rights defenders, and other activists are increasingly targets of disinformation campaigns and violence. Their protection is essential. Journalists, activists and organizations can be key allies in combating false narratives, promoting inclusion, and encouraging civic engagement. Oversight and ethics bodies are also critical. Organizations that have developed resources and training to better support journalists against online and offline violence and violence against women include the Coalition Against Online Violence,{{cite web |title=Training |url=https://onlineviolenceresponsehub.org/training |website=Coalition Against Online Violence |access-date=30 January 2023}}{{cite web |title=Home |url=https://onlineviolenceresponsehub.org/ |website=Coalition Against Online Violence |access-date=30 January 2023}} the Knight Center for Journalism in the Americas,{{cite news |title=New free online course for women journalists and allies: Learn how to plan for reporting safely |url=https://knightcenter.utexas.edu/new-free-online-course-for-women-journalists-and-allies-learn-how-to-plan-for-reporting-safely/ |access-date=30 January 2023 |work=Knight Center for Journalism in the Americas |date=19 April 2021}} the International Women's Media Foundation,{{cite web |title=Hostile Environment Training - IWMF |url=https://www.iwmf.org/programs/hefat-training/ |website=International Women's Media Foundation |access-date=30 January 2023 |language=en}} UNESCO,{{cite news |last1=Johnson |first1=Brian |title=UNESCO program seeks to improve journalists' safety through police training |url=https://dc.medill.northwestern.edu/blog/2020/03/19/journalist-safety/#sthash.8nlNMfBL.dpbs |access-date=30 January 2023 |work=Medill News Service |date=19 March 2020}} PEN America,{{cite web |title=Online Abuse Defense Training Program |url=https://pen.org/online-abuse-defense-training-program/ |website=PEN America |access-date=30 January 2023 |language=en |date=2 September 2021}} First Draft,{{cite web |title=First Draft's Essential Guides to reporting on disinformation |url=https://firstdraftnews.org/long-form-article/first-drafts-essential-guide-to/ |website=First Draft |access-date=27 February 2025}} and others.{{cite web |title=Journalist, Human Rights, Digital Safety Training |url=https://www.gjs-security.com/training/ |website=Global Journalist Security |access-date=30 January 2023}}

= Education and awareness =

{{external media | width = 210px | float = right | headerimage= | video1 = [https://www.youtube.com/watch?v=SsvhafMb3gw&list=PL5mFTWDtszx0P5bVBes0MvJuVJor_xp9L&ab_channel=YALINetwork Understanding and Countering Disinformation], YALI Network, September 29, 2022.}}

Media literacy education and information on how to identify and combat disinformation are recommended for public schools and universities. In 2022, countries in the European Union were ranked on a Media Literacy Index measuring resilience against disinformation. Finland, the highest-ranking country, has developed an extensive curriculum that teaches critical thinking and resistance to information warfare and has integrated it into its public education system. Finns also rank high in trust in government authorities and the media.{{cite news |last1=Benke |first1=Erika |last2=Spring |first2=Marianna |title=US midterm elections: Does Finland have the answer to fake news? |url=https://www.bbc.com/news/world-europe-63222819 |access-date=25 January 2023 |work=BBC News |date=12 October 2022}}{{cite web |title=How It Started, How It is Going: Media Literacy Index 2022 |url=https://osis.bg/wp-content/uploads/2022/10/HowItStarted_MediaLiteracyIndex2022_ENG_.pdf |website=Open Society Institute |access-date=25 January 2023}}{{cite news |last1=Henley |first1=Jon |title=How Finland starts its fight against fake news in primary schools |url=https://www.theguardian.com/world/2020/jan/28/fact-from-fiction-finlands-new-lessons-in-combating-fake-news |access-date=27 February 2025 |work=The Guardian |date=29 January 2020}} Organizations such as Faktabaari and Mediametka develop tools and resources around information, media and voter literacy.

Following a 2007 cyberattack that included disinformation tactics, Estonia focused on improving its cyberdefenses and made media literacy education a major focus from kindergarten through high school.{{cite news |last1=Roussi |first1=Antoaneta |title=Estonia fends off 'extensive' cyberattack following Soviet monument removal |url=https://www.politico.eu/article/estonia-extensive-cyber-attack-following-soviet-war-monument-removal/ |access-date=22 December 2022 |work=POLITICO |date=18 August 2022}}

In 2018, the Executive Vice President of the European Commission for A Europe Fit for the Digital Age gathered a group of experts to produce a report with recommendations for teaching digital literacy. Proposed digital literacy curricula familiarize students with fact-checking websites such as Snopes and FactCheck.org, aiming to equip students with the critical thinking skills needed to discern between factual content and disinformation online.{{Cite journal|last=Glisson|first=Lane|date=2019|title=Breaking the Spin Cycle: Teaching Complexity in the Age of Fake News|url=https://muse.jhu.edu/article/729198|journal=Portal: Libraries and the Academy|language=en|volume=19|issue=3|pages=461–484|doi=10.1353/pla.2019.0027|s2cid=199016070|issn=1530-7131|url-access=subscription}}

Suggested areas of focus include skills in critical thinking,{{cite journal |last1=Trecek-King |first1=Melanie |title=Inoculating Students against Misinformation by Having Them Create It |journal=Skeptical Inquirer |date=24 October 2023 |volume=47 |issue=6 |url=https://skepticalinquirer.org/2023/10/inoculating-students-against-misinformation-by-having-them-create-it/ |url-status=live |archive-url= https://web.archive.org/web/20231030171148/https://skepticalinquirer.org/2023/10/inoculating-students-against-misinformation-by-having-them-create-it/ |archive-date= Oct 30, 2023 }} information literacy,{{cite journal |last1=De Paor |first1=Saoirse |last2=Heravi |first2=Bahareh |title=Information literacy and fake news: How the field of librarianship can help combat the epidemic of fake news |s2cid-access=free |journal=The Journal of Academic Librarianship |date=1 September 2020 |volume=46 |issue=5 |pages=102218 |doi=10.1016/j.acalib.2020.102218 |s2cid=225301320 |issn=0099-1333|doi-access=free }}{{cite web |last1=Cunningham |first1=Nancy |title= Fake News and Information Literacy: Introduction |url=https://researchguides.uoregon.edu/fakenews |date=Sep 8, 2023 |website=Research Guides at University of Oregon Libraries |language=en |url-status=live |archive-url= https://web.archive.org/web/20231031094835/https://researchguides.uoregon.edu/fakenews |archive-date= Oct 31, 2023 }} science literacy,{{cite web |title=About the Report |url=https://sciedandmisinfo.stanford.edu/about-report |website=Science Education in an Age of Misinformation |publisher=Stanford |language=en |url-status=live |archive-url=https://web.archive.org/web/20231030174943/https://sciedandmisinfo.stanford.edu/about-report |archive-date= Oct 30, 2023 }} and health literacy.{{cite journal |last1=Swire-Thompson |first1=Briony |last2=Lazer |first2=David |title=Public Health and Online Misinformation: Challenges and Recommendations |journal=Annual Review of Public Health |date=2 April 2020 |volume=41 |issue=1 |pages=433–451 |doi=10.1146/annurev-publhealth-040119-094127 |pmid=31874069 |s2cid=209473873 |language=en |issn=0163-7525|doi-access=free }}

Another approach is to build interactive games such as the Cranky Uncle game, which teaches critical thinking and inoculates players against techniques of disinformation and science denial. The Cranky Uncle game is freely available and has been translated into at least 9 languages.{{cite news |last1=Murray |first1=Jessica |title=Cranky Uncle game takes on climate crisis denial and fake news |url=https://www.theguardian.com/games/2019/dec/07/cranky-uncle-game-takes-on-climate-crisis-denial-and-fake-news |work=The Guardian |date=7 December 2019 |url-status=live |archive-url=https://web.archive.org/web/20231005113317/https://www.theguardian.com/games/2019/dec/07/cranky-uncle-game-takes-on-climate-crisis-denial-and-fake-news |archive-date= Oct 5, 2023 }}{{cite web |title=Cranky Uncle game: building resilience against misinformation |url=https://crankyuncle.com/game/ |website=Cranky Uncle |url-status=live |archive-url=https://web.archive.org/web/20240131081543/https://crankyuncle.com/game/ |archive-date= Jan 31, 2024 }} Videos for teaching critical thinking and addressing disinformation can also be found online.{{cite news |title=School of Media Studies Professor Robert Berkman's Fighting Disinformation Video Series Helps Students Sort Fact from Fiction Online |url=https://blogs.newschool.edu/news/2023/01/school-of-media-studies-professor-robert-berkmans-fighting-disinformation-video-series-helps-students-sort-fact-from-fiction-online/ |work=The New School News |date=26 January 2023 |url-status=live |archive-url=https://web.archive.org/web/20231102162835/https://blogs.newschool.edu/news/2023/01/school-of-media-studies-professor-robert-berkmans-fighting-disinformation-video-series-helps-students-sort-fact-from-fiction-online/ |archive-date= Nov 2, 2023 }}{{cite web |title=Fighting Disinformation: A Six Part Series for New School Students |url=https://www.youtube.com/playlist?list=PL1sEA_rld1jTSbJENPV7Cc0CvlUGch1ug |website=YouTube |first1=Robert |last1=Berkman |date=Sep 30, 2022 |url-status=live |archive-url=https://web.archive.org/web/20231127182608/https://www.youtube.com/playlist?list=PL1sEA_rld1jTSbJENPV7Cc0CvlUGch1ug |archive-date= Nov 27, 2023 }}

Training and best practices for identifying and countering disinformation are being developed and shared by groups of journalists, scientists, and others (e.g. Climate Action Against Disinformation,{{cite book |last1=Gibson |first1=Connor |title=Journalist Field Guide: Navigating Climate Misinformation |date=2022 |publisher=Climate Action Against Disinformation |url=https://caad.info/wp-content/uploads/2022/10/CAAD-Journalist-Field-Guide.pdf}} PEN America,{{cite web |title=Knowing the News: A Media Literacy & Disinformation Defense Project |url=https://pen.org/knowing-the-news/ |website=PEN America |access-date=6 December 2022 |language=en |date=23 September 2020}}{{cite web |title=The Impact of Community-Based Digital Literacy Interventions on Disinformation Resilience |url=https://pen.org/report/the-impact-of-community-based-digital-literacy-interventions-on-disinformation-resilience/ |website=PEN America |access-date=6 December 2022 |language=en |date=29 September 2022}}{{cite web |title=Hard News: Journalists and the Threat of Disinformation |url=https://pen.org/report/hard-news-journalists-and-the-threat-of-disinformation/ |website=PEN America |access-date=6 December 2022 |language=en |date=14 April 2022}} UNESCO,{{cite web |editor-first1=Cherilyn |editor-last1=Ireton |editor-first2= Julie |editor-last2= Posetti |title=Journalism, 'Fake News' and Disinformation: A Handbook for Journalism Education and Training |url=https://en.unesco.org/fightfakenews |website=UNESCO |access-date=6 December 2022 |language=en |date=3 September 2018}} the Union of Concerned Scientists,{{cite web |title=How Disinformation Works |url=https://www.ucsusa.org/resources/how-disinformation-works |website=Union of Concerned Scientists |access-date=6 December 2022 |language=en}}{{cite web |title=The Disinformation Playbook |url=https://www.ucsusa.org/resources/disinformation-playbook |website=Union of Concerned Scientists |access-date=6 December 2022 |language=en}} and the Young African Leaders Initiative{{cite web |title=Course: Understanding and Countering Disinformation |url=https://yali.state.gov/courses/course-5115/#/lesson/lesson-1-disinformation-with-intent-to-harm |website=Young African Leaders Initiative |access-date=6 December 2022}}).

Research suggests that a number of tactics have proven useful against scientific disinformation around climate change. These include: 1) providing clear explanations about why climate change is occurring; 2) indicating that there is scientific consensus about the existence of climate change and about its basis in human actions; 3) presenting information in ways that are culturally aligned with the listener; and 4) "inoculating" people by clearly identifying misinformation (ideally before a myth is encountered, but also later through debunking).{{cite journal |last1=Lewandowsky |first1=Stephan |title=Climate Change Disinformation and How to Combat It |journal=Annual Review of Public Health |date=1 April 2021 |volume=42 |issue=1 |pages=1–21 |doi=10.1146/annurev-publhealth-090419-102409 |pmid=33355475 |hdl=1983/c6a6a1f8-6ba4-4a12-9829-67c14c8ae2e5 |s2cid=229691604 |url=https://www.annualreviews.org/doi/full/10.1146/annurev-publhealth-090419-102409 |access-date=6 December 2022 |issn=0163-7525|hdl-access=free }}{{cite journal |last1=Hornsey |first1=Matthew J. |last2=Lewandowsky |first2=Stephan |title=A toolkit for understanding and addressing climate scepticism |journal=Nature Human Behaviour |date=November 2022 |volume=6 |issue=11 |pages=1454–1464 |doi=10.1038/s41562-022-01463-y |pmid=36385174 |pmc=7615336 |hdl=1983/c3db005a-d941-42f1-a8e9-59296c66ec9b |s2cid=253577142 |language=en |issn=2397-3374}}

A "Toolbox of Interventions Against Online Misinformation and Manipulation" reviews research into individually-focused interventions to combat misinformation and their possible effectiveness. Tactics include:{{cite journal |last1=Kozyreva |first1=Anastasia |last2=Lorenz-Spreen |first2=Philipp |last3=Herzog |first3=Stefan Michael |last4=Ecker |first4=Ullrich K. H. |last5=Lewandowsky |first5=Stephan |last6=Hertwig |first6=Ralph |title=Toolbox of Interventions Against Online Misinformation and Manipulation |website=psyarxiv.com |date=16 December 2022 |doi=10.31234/osf.io/x8ejt |url=https://psyarxiv.com/x8ejt/ |access-date=22 December 2022}}{{cite web |title=Toolbox: Conceptual overview |url=https://interventionstoolbox.mpib-berlin.mpg.de/table_concept.html |website=Toolbox of interventions against online misinformation and manipulation |access-date=22 December 2022}}

  • Accuracy prompts – Social media and other sources of information can cue people to think about accuracy before sharing information online.{{cite journal |last1=Pennycook |first1=Gordon |last2=Rand |first2=David G. |title=Accuracy prompts are a replicable and generalizable approach for reducing the spread of misinformation |journal=Nature Communications |date=28 April 2022 |volume=13 |issue=1 |pages=2333 |doi=10.1038/s41467-022-30073-5 |pmid=35484277 |pmc=9051116 |bibcode=2022NatCo..13.2333P |language=en |issn=2041-1723}}{{cite journal |last1=Epstein |first1=Ziv |last2=Berinsky |first2=Adam J. |last3=Cole |first3=Rocky |last4=Gully |first4=Andrew |last5=Pennycook |first5=Gordon |last6=Rand |first6=David G. |title=Developing an accuracy-prompt toolkit to reduce COVID-19 misinformation online |journal=Harvard Kennedy School Misinformation Review |date=18 May 2021 |doi=10.37016/mr-2020-71 |s2cid=234845514 |url=https://misinforeview.hks.harvard.edu/article/developing-an-accuracy-prompt-toolkit-to-reduce-covid-19-misinformation-online/ |access-date=22 December 2022|hdl=1721.1/138124.2 |hdl-access=free }}
  • Debunking – To expose false information, first focus on highlighting the true facts, before pointing out that misleading information is going to be given, and only then specifying the misinformation and explaining why it is wrong. Finally, the correct explanation should be reinforced.{{cite journal |last1=Ecker |first1=Ullrich K. H. |last2=Lewandowsky |first2=Stephan |last3=Cook |first3=John |last4=Schmid |first4=Philipp |last5=Fazio |first5=Lisa K. |last6=Brashier |first6=Nadia |last7=Kendeou |first7=Panayiota |last8=Vraga |first8=Emily K. |last9=Amazeen |first9=Michelle A. |title=The psychological drivers of misinformation belief and its resistance to correction |journal=Nature Reviews Psychology |date=January 2022 |volume=1 |issue=1 |pages=13–29 |doi=10.1038/s44159-021-00006-y |s2cid=245916820 |language=en |issn=2731-0574|doi-access=free |hdl=1983/889ddb0f-0d44-44f4-a54f-57c260ae4917 |hdl-access=free }}{{cite news |last1=Black |first1=Ian |title=Best Practices for Debunking Misinformation |url=https://www.labmanager.com/big-picture/effective-communication-in-academia-and-industry/best-practices-for-debunking-misinformation-28879 |access-date=22 December 2022 |work=Lab Manager |date=September 28, 2022 |language=en}} This way of countering disinformation is sometimes referred to as a "truth sandwich".{{cite news |last1=Clark |first1=Roy Peter |title=How to serve up a tasty 'truth sandwich?' The secret sauce is emphatic word order. |url=https://www.poynter.org/reporting-editing/2020/how-to-serve-up-a-tasty-truth-sandwich/ |access-date=15 May 2023 |work=Poynter |publisher=The Poynter Institute for Media Studies, Inc. |date=August 18, 2020}}
  • Avoiding confrontation – Evidence suggests that when someone feels challenged or threatened by information that does not fit their existing worldview, they will "double down" on their previous beliefs rather than consider the new information. However, if clear evidence can be presented in a friendly and non-confrontational way, without arousing aggression or hostility, the new information is more likely to be considered.
  • Friction – Clickbait aimed at spreading disinformation tries to get people to react quickly and emotionally. Cueing people to slow down and think about their actions (e.g. by displaying a prompt like "Want to read this before sharing?") can limit the spread of disinformation; a minimal sketch of such a prompt appears after this list.{{cite journal |last1=Bates |first1=Jo |title=The politics of data friction |journal=Journal of Documentation |date=1 January 2017 |volume=74 |issue=2 |pages=412–429 |doi=10.1108/JD-05-2017-0080 |url=https://eprints.whiterose.ac.uk/120075/3/The%20politics%20of%20data%20friction%20-%20final%20%281%29.pdf |access-date=22 December 2022 |issn=0022-0418}}
  • Inoculation – Preemptively warning people about possible disinformation and techniques used to spread disinformation, before they are exposed to an intended false message, can help them to identify false messages and attempts at manipulation.{{cite journal |last1=Roozenbeek |first1=Jon |last2=van der Linden |first2=Sander |last3=Goldberg |first3=Beth |last4=Rathje |first4=Steve |last5=Lewandowsky |first5=Stephan |title=Psychological inoculation improves resilience against misinformation on social media |journal=Science Advances |date=26 August 2022 |volume=8 |issue=34 |pages=eabo6254 |doi=10.1126/sciadv.abo6254 |pmid=36001675 |pmc=9401631 |bibcode=2022SciA....8O6254R |language=en |issn=2375-2548}}{{cite journal |last1=Boháček |first1=Matyáš |last2=Farid |first2=Hany |title=Protecting world leaders against deep fakes using facial, gestural, and vocal mannerisms |journal=Proceedings of the National Academy of Sciences |date=29 November 2022 |volume=119 |issue=48 |pages=e2216035119 |doi=10.1073/pnas.2216035119 |doi-access=free |pmid=36417442 |pmc=9860138 |bibcode=2022PNAS..11916035B |s2cid=253801197 |language=en |issn=0027-8424}} Short videos that describe specific tactics like fearmongering, the use of emotional language, or fake experts, help people resist online persuasion techniques.{{cite news |last1=Grant |first1=Nico |last2=Hsu |first2=Tiffany |title=Google Finds 'Inoculating' People Against Misinformation Helps Blunt Its Power |url=https://www.nytimes.com/2022/08/24/technology/google-search-misinformation.html |access-date=22 December 2022 |work=The New York Times |date=24 August 2022}}
  • Lateral reading – Fact check information by looking for independent and reputable sources. Verify the credibility of information on a website by independently searching the Web, not just looking at the original site.{{cite journal |last1=Breakstone |first1=Joel |last2=Smith |first2=Mark |last3=Connors |first3=Priscilla |last4=Ortega |first4=Teresa |last5=Kerr |first5=Darby |last6=Wineburg |first6=Sam |title=Lateral reading: College students learn to critically evaluate internet sources in an online course |journal=Harvard Kennedy School Misinformation Review |date=23 February 2021 |doi=10.37016/mr-2020-56 |s2cid=233896933 |url=https://misinforeview.hks.harvard.edu/article/lateral-reading-college-students-learn-to-critically-evaluate-internet-sources-in-an-online-course/ |access-date=22 December 2022|doi-access=free }}{{cite journal |last1=Wineburg |first1=Sam |last2=Breakstone |first2=Joel |last3=McGrew |first3=Sarah |last4=Smith |first4=Mark D. |last5=Ortega |first5=Teresa |title=Lateral reading on the open Internet: A district-wide field study in high school government classes. |journal=Journal of Educational Psychology |date=July 2022 |volume=114 |issue=5 |pages=893–909 |doi=10.1037/edu0000740 |s2cid=248190572 |url=https://doi.org/10.1037/edu0000740 |access-date=22 December 2022 |language=en |issn=1939-2176|url-access=subscription }}
  • Media-literacy tips – Specific strategies for spotting false news, such as those used in Facebook's 2017 "Tips to Spot False News" (e.g. "be sceptical of headlines", "look closely at the URL") can help users to better discriminate between real and fake news stories.{{cite journal |last1=Guess |first1=Andrew M. |last2=Lerner |first2=Michael |last3=Lyons |first3=Benjamin |last4=Montgomery |first4=Jacob M. |last5=Nyhan |first5=Brendan |last6=Reifler |first6=Jason |last7=Sircar |first7=Neelanjan |title=A digital media literacy intervention increases discernment between mainstream and false news in the United States and India |journal=Proceedings of the National Academy of Sciences |date=7 July 2020 |volume=117 |issue=27 |pages=15536–15545 |doi=10.1073/pnas.1920498117 |pmid=32571950 |pmc=7355018 |bibcode=2020PNAS..11715536G |language=en |issn=0027-8424|doi-access=free }}{{cite news |last1=Yee |first1=Amy |title=The country inoculating against disinformation |url=https://www.bbc.com/future/article/20220128-the-country-inoculating-against-disinformation |access-date=22 December 2022 |work=BBC |date=30 January 2022 |language=en}}
  • Rebuttals of science denialism – Scientific denial can involve both inaccurate assertions about a particular topic (topic rebuttal) and rhetorical techniques and strategies that undermine, mislead or deny the validity of science as an activity (technique rebuttal). Countering science denial must address both types of tactics.{{cite journal |last1=Schmid |first1=Philipp |last2=Betsch |first2=Cornelia |title=Effective strategies for rebutting science denialism in public discussions |journal=Nature Human Behaviour |date=September 2019 |volume=3 |issue=9 |pages=931–939 |doi=10.1038/s41562-019-0632-4 |pmid=31235861 |s2cid=195329680 |url=https://www.nature.com/articles/s41562-019-0632-4 |access-date=22 December 2022 |language=en |issn=2397-3374|url-access=subscription }}{{cite journal |last1=O'Keefe |first1=Daniel J. |title=How to Handle Opposing Arguments in Persuasive Messages: A Meta-Analytic Review of the Effects of One-Sided and Two-Sided Messages |journal=Annals of the International Communication Association |date=1 January 1999 |volume=22 |issue=1 |pages=209–249 |doi=10.1080/23808985.1999.11678963 |url=https://doi.org/10.1080/23808985.1999.11678963 |access-date=22 December 2022 |issn=2380-8985|url-access=subscription }}{{cite book |last1=World Health Organization. Regional Office for Europe |title=How to respond to vocal vaccine deniers in public: best practice guidance |date=2017 |publisher=World Health Organization. Regional Office for Europe |hdl=10665/343301 |url=https://apps.who.int/iris/handle/10665/343301 |access-date=22 December 2022 |language=en}} Research into science denial raises questions about the societal understanding of science and scientific processes and how to improve science education.{{cite journal |last1=Fackler |first1=Ayça |title=When Science Denial Meets Epistemic Understanding |journal=Science & Education |date=1 June 2021 |volume=30 |issue=3 |pages=445–461 |doi=10.1007/s11191-021-00198-y |pmid=33746364 |pmc=7966612 |bibcode=2021Sc&Ed..30..445F |language=en |issn=1573-1901}}{{cite journal |last1=Golumbic |first1=Yaela N |last2=Motion |first2=Alice |last3=Chau |first3=Amy |last4=Choi |first4=Leo |last5=D'Silva |first5=Dominique |last6=Ho |first6=Jasmine |last7=Nielsen |first7=Mai |last8=Shi |first8=Kevin |last9=Son |first9=Caroline D. |last10=Wu |first10=Olivia |last11=Zhang |first11=Shirley |last12=Zheng |first12=Daisy |last13=Scroggie |first13=Kymberley R |title=Self-reflection promotes learning in citizen science and serves as an effective assessment tool |journal=Computers and Education Open |date=1 December 2022 |volume=3 |pages=100104 |doi=10.1016/j.caeo.2022.100104 |s2cid=251993371 |language=en |issn=2666-5573|doi-access=free }}{{cite journal |last1=Nygren |first1=Thomas |last2=Frau-Meigs |first2=Divina |author-link2=Divina Frau-Meigs |last3=Corbu |first3=Nicoleta |last4=Santoveña-Casal |first4=Sonia |title=Teachers' views on disinformation and media literacy supported by a tool designed for professional fact-checkers: perspectives from France, Romania, Spain and Sweden |journal=SN Social Sciences |date=9 April 2022 |volume=2 |issue=4 |pages=40 |doi=10.1007/s43545-022-00340-9 |pmid=35434642 |pmc=8994523 |language=en |issn=2662-9283}}{{cite journal |last1=Foster |first1=Craig A. 
|title=[Review] Science Denial: Why It Happens and What to Do About It {{!}} National Center for Science Education |journal=Reports of the National Center for Science Education |date=October 3, 2022 |volume=42 |issue=4 |url=https://ncse.ngo/review-science-denial-why-it-happens-and-what-do-about-it |access-date=22 December 2022 |language=en}}
  • Self-reflection tools – Various cognitive, social and affective factors are involved in beliefs, judgments and decisions. Individual differences in personality traits, such as extraversion and the tendency to feel anger, anxiety, stress, depression, or fear, are associated with a higher likelihood of sharing rumors online.{{cite journal |last1=Li |first1=Kai |last2=Li |first2=Jie |last3=Zhou |first3=Fen |title=The Effects of Personality Traits on Online Rumor Sharing: The Mediating Role of Fear of COVID-19 |journal=International Journal of Environmental Research and Public Health |date=18 May 2022 |volume=19 |issue=10 |pages=6157 |doi=10.3390/ijerph19106157 |pmid=35627694 |pmc=9140700 |doi-access=free }} Higher levels of agreeableness, conscientiousness and open-mindedness, and lower levels of extraversion, are related to greater accuracy when identifying headlines as true or false. People who are more accurate in identifying headlines also report spending less time reading the news each week.{{cite journal |last1=Calvillo |first1=Dustin P. |last2=Garcia |first2=Ryan J.B. |last3=Bertrand |first3=Kiana |last4=Mayers |first4=Tommi A. |title=Personality factors and self-reported political news consumption predict susceptibility to political fake news |journal=Personality and Individual Differences |date=May 2021 |volume=174 |pages=110666 |doi=10.1016/j.paid.2021.110666 |s2cid=233565702 |doi-access=free }} Self-reflection tools that make people aware of their possible vulnerabilities may help them to identify microtargeting directed at individual traits.{{cite journal |last1=Lorenz-Spreen |first1=Philipp |last2=Geers |first2=Michael |last3=Pachur |first3=Thorsten |last4=Hertwig |first4=Ralph |last5=Lewandowsky |first5=Stephan |last6=Herzog |first6=Stefan M. |title=Boosting people's ability to detect microtargeted advertising |journal=Scientific Reports |date=30 July 2021 |volume=11 |issue=1 |pages=15541 |doi=10.1038/s41598-021-94796-z |pmid=34330948 |pmc=8324838 }}
  • Social norms – Disinformation often works to undermine social norms, normalizing and thriving on an atmosphere of confusion, distrust, fear and violence.{{cite news |last1=Cuffley |first1=Adrienne |title=Social Media Misinformation and the Prevention of Political Instability and Mass Atrocities • Stimson Center |url=https://www.stimson.org/2022/social-media-misinformation-and-the-prevention-of-political-instability-and-mass-atrocities/ |access-date=23 December 2022 |work=Stimson Center |date=7 July 2022}}{{cite book |last1=Colomina |first1=Carme |last2=Sanchez Margalef |first2=Héctor |last3=Youngs |first3=Richard |title=The impact of disinformation on democratic processes and human rights in the world |date=2021 |publisher=Directorate General for External Policies of the Union |url=https://www.europarl.europa.eu/RegData/etudes/STUD/2021/653635/EXPO_STU(2021)653635_EN.pdf}}{{cite book |last1=Nemr |first1=Christina |last2=Gangware |first2=William |title=Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age |date=March 28, 2019 |publisher=Park Advisors |url=https://www.state.gov/wp-content/uploads/2019/05/Weapons-of-Mass-Distraction-Foreign-State-Sponsored-Disinformation-in-the-Digital-Age.pdf |access-date=26 January 2023}} In contrast, changing or emphasizing positive social norms is often a focus in programs attempting to improve health and social behaviors.{{cite journal |last1=Edberg |first1=Mark |last2=Krieger |first2=Laurie |title=Recontextualizing the social norms construct as applied to health promotion |journal=SSM - Population Health |date=April 2020 |volume=10 |pages=100560 |doi=10.1016/j.ssmph.2020.100560 |pmid=32140543 |pmc=7047191 }} Social norms may help to reinforce the importance of accurate information, and discourage the sharing and use of false information. Strong social norms can influence members' priorities, expectations, and bonds with one another.{{Cite journal |last1=Chatman |first1=Jennifer A. |last2=Cha |first2=Sandra Eunyoung |date=2003 |title=Leading by Leveraging Culture |url=http://college.emory.edu/faculty/documents/articles/leadingwithculture.pdf |journal=California Management Review |volume=45 |issue=4 |pages=20–34|doi=10.2307/41166186 |jstor=41166186 }} They may encourage the adoption of best practices and higher standards for dealing with disinformation on the part of the news industry, technology companies, educational institutions, and individuals.{{cite web |last1=West |first1=Darrell M. |title=How to combat fake news and disinformation |url=https://www.brookings.edu/research/how-to-combat-fake-news-and-disinformation/ |website=Brookings |date=18 December 2017}}{{cite news |last1=Rainie |first1=Lee |title=The Future of Truth and Misinformation Online |url=https://www.pewresearch.org/internet/2017/10/19/the-future-of-truth-and-misinformation-online/ |access-date=23 December 2022 |work=Pew Research Center: Internet, Science & Tech |date=19 October 2017}}{{cite journal |last1=Reisach |first1=Ulrike |title=The responsibility of social media in times of societal and political manipulation |journal=European Journal of Operational Research |date=16 June 2021 |volume=291 |issue=3 |pages=906–917 |doi=10.1016/j.ejor.2020.09.020 |pmid=32982027 |pmc=7508050 |language=en |issn=0377-2217}}
  • Warning and fact-checking labels – Online platforms have made intermittent attempts to flag information whose content or source is considered questionable. Warning labels can indicate that a piece of information or a source may be misleading. Fact-checking labels can give the ratings of professional or independent fact-checkers using a rating scale (e.g., as false or altered) or indicate the grounds for their rating.{{cite news |last1=Thorbecke |first1=Catherine |title=What to know about Twitter's fact-checking labels |url=https://abcnews.go.com/Business/twitters-fact-checking-labels/story?id=70903715 |access-date=23 December 2022 |work=ABC News |date=May 27, 2020 |language=en}}
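
To make the friction intervention above concrete, a share action can be gated behind a confirmation step when the user has not opened the link. This is a minimal sketch under assumed names; real platform implementations are UI code rather than anything this simple, and the prompt flow here is purely illustrative.

<syntaxhighlight lang="python">
def share_with_friction(has_opened_link, prompt=input):
    """Gate a reshare behind a slow-down prompt when the user has not
    read the linked content. `prompt` is injectable for testing; the
    function name and return values are illustrative, not a real API.
    """
    if not has_opened_link:
        answer = prompt("Want to read this before sharing? (y = read first / n = share anyway) ")
        if answer.strip().lower() == "y":
            return "opened_for_reading"  # friction worked: the user pauses
    return "shared"

# Simulate a user who accepts the nudge instead of sharing immediately.
print(share_with_friction(False, prompt=lambda _: "y"))  # opened_for_reading
</syntaxhighlight>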


References

{{Disinformation}}
