Internet manipulation
{{Short description|Manipulation of digital technology}}
Internet manipulation is the use of online digital technologies, including algorithms, social bots, and automated scripts, for commercial, social, military, or political purposes.{{Cite book|title=Computational Propaganda: Political Parties, Politicians, and Political Manipulation on Social Media|last1=Woolley|first1=Samuel|last2=Howard|first2=Philip N.|publisher=Oxford University Press|year=2019|isbn=978-0190931414}} Internet and social media manipulation are the prime vehicles for spreading disinformation due to the importance of digital platforms for media consumption and everyday communication.{{Cite journal |last=Diaz Ruiz |first=Carlos |date=2023-10-30 |title=Disinformation on digital media platforms: A market-shaping approach |journal=New Media & Society |language=en |doi=10.1177/14614448231207644 |s2cid=264816011 |issn=1461-4448|doi-access=free }} When employed for political purposes, internet manipulation may be used to steer public opinion,{{Cite web |last1=Marchal |first1=Nahema |last2=Neudert |first2=Lisa-Maria |date=2019 |title=Polarisation and the use of technology in political campaigns and communication |url=https://www.europarl.europa.eu/RegData/etudes/STUD/2019/634414/EPRS_STU(2019)634414_EN.pdf |website=European Parliamentary Research Service}} polarise citizens,{{Cite journal |last1=Kreiss |first1=Daniel |last2=McGregor |first2=Shannon C |date=2023-04-11 |title=A review and provocation: On polarization and platforms |journal=New Media & Society |volume=26 |language=en |pages=556–579 |doi=10.1177/14614448231161880 |s2cid=258125103 |issn=1461-4448|doi-access=free }} circulate conspiracy theories,{{Cite journal |last1=Diaz Ruiz |first1=Carlos |last2=Nilsson |first2=Tomas |date=2023 |title=Disinformation and Echo Chambers: How Disinformation Circulates on Social Media Through Identity-Driven Controversies |url=http://journals.sagepub.com/doi/10.1177/07439156221103852 |journal=Journal of Public Policy & Marketing |language=en |volume=42 |issue=1 
|pages=18–35 |doi=10.1177/07439156221103852 |s2cid=248934562 |issn=0743-9156|doi-access=free }} and silence political dissidents. Internet manipulation can also be done for profit, for instance, to harm corporate or political adversaries and improve brand reputation.{{Cite journal |last1=Di Domenico |first1=Giandomenico |last2=Ding |first2=Yu |date=2023-10-23 |title=Between Brand attacks and broader narratives: how direct and indirect misinformation erode consumer trust |journal=Current Opinion in Psychology |volume=54 |pages=101716 |doi=10.1016/j.copsyc.2023.101716 |pmid=37952396 |s2cid=264474368 |issn=2352-250X|doi-access=free }} Internet manipulation is sometimes also used to describe the selective enforcement of Internet censorship{{cite book|last1=Castells|first1=Manuel|title=Networks of Outrage and Hope: Social Movements in the Internet Age|publisher=John Wiley & Sons|isbn=9780745695792|url=https://books.google.com/books?id=ETHOCQAAQBAJ&pg=PT59|access-date=4 February 2017|language=en|date=2015-06-04}}{{cite news|title=Condemnation over Egypt's internet shutdown|url=https://www.ft.com/content/08dbe398-2abb-11e0-a2f3-00144feab49a|newspaper=Financial Times|access-date=4 February 2017}} or selective violations of net neutrality.{{cite web|title=Net neutrality wins in Europe – a victory for the internet as we know it|url=http://www.zmescience.com/ecology/world-problems/net-neutrality-europe-31082016/|publisher=ZME Science|access-date=4 February 2017|date=31 August 2016}}
When internet manipulation is carried out for propaganda purposes with the help of data analysis and internet bots on social media, it is called computational propaganda.
Issues
Internet manipulation often aims to change user perceptions and their corresponding behaviors. Since the early 2000s, the notion of cognitive hacking has referred to a cyberattack that aims to change human behavior.{{cite conference |last1=Thompson |first1=Paul |year=2004 |title=Cognitive hacking and intelligence and security informatics |url=https://pdfs.semanticscholar.org/cb06/dc28f2fc4ceb5947607846603ed749ef99b5.pdf |url-status=dead|conference= Defense and Security|location=Orlando, Florida, United States |series=Enabling Technologies for Simulation Science VIII |volume=5423 |pages=142–151 |bibcode=2004SPIE.5423..142T |doi=10.1117/12.554454 |s2cid=18907972 |archive-url=https://web.archive.org/web/20170205095753/https://pdfs.semanticscholar.org/cb06/dc28f2fc4ceb5947607846603ed749ef99b5.pdf |archive-date=5 February 2017 |access-date=4 February 2017 |editor-first1=Dawn A. |editor-first2=Alex F. |editor-last1=Trevisani |editor-last2=Sisti}}{{Cite journal |title=Cognitive hacking: a battle for the mind |url=https://ieeexplore.ieee.org/document/1023788 |access-date=2023-11-02 |journal=Computer |date=2002 |language=en-US |doi=10.1109/mc.2002.1023788 |last1=Cybenko |first1=G. |last2=Giani |first2=A. |last3=Thompson |first3=P. |volume=35 |issue=8 |pages=50–56 |url-access=subscription }} Today, fake news, disinformation attacks, and deepfakes can secretly affect behavior in ways that are difficult to detect.{{Cite journal|last=Bastick|first=Zach|date=2021|title=Would you notice if fake news changed your behavior? An experiment on the unconscious effects of disinformation|journal=Computers in Human Behavior|volume=116|issue=106633|page=106633|doi=10.1016/j.chb.2020.106633|doi-access=free}}
Content that evokes high-arousal emotions (e.g. awe, anger, or anxiety, or content with hidden sexual meaning) has been found to be more viral, as has content that is surprising, interesting, or practically useful.{{cite journal|last1=Berger|first1=Jonah|last2=Milkman|first2=Katherine L|author-link2=Katy Milkman|date=April 2012|title=What Makes Online Content Viral?|url=http://jonahberger.com/wp-content/uploads/2013/02/ViralityB.pdf|journal=Journal of Marketing Research|volume=49|issue=2|pages=192–205|doi=10.1509/jmr.10.0353|s2cid=29504532}}
Providing and perpetuating simple explanations for complex circumstances may be used for online manipulation. Such explanations are often easier to believe, circulate before any adequate investigation has taken place, and achieve higher virality than complex, nuanced explanations.{{cite web|last1=Hoff|first1=Carsten Klotz von|title=Manipulation 2.0 – Meinungsmache via Facebook|url=https://www.freitag.de/autoren/lordlommel/manipulation-2-0-2013-meinungsmache-via-facebook-und-so|publisher=Der Freitag|access-date=4 February 2017|language=de-DE|date=6 April 2012}} (See also: Low-information rationality)
Prior collective ratings of web content influence one's own perception of it. In 2015 it was shown that the perceived beauty of a piece of artwork in an online context varies with external influence: confederate ratings, manipulated for opinion and credibility, shifted the evaluations of experiment participants who were asked to assess a piece of artwork.{{cite book|last1=Golda|first1=Christopher P.|title=Informational Social Influence and the Internet: Manipulation in a Consumptive Society|url=https://books.google.com/books?id=3IX4jwEACAAJ|access-date=4 February 2017|language=en|year=2015}} Furthermore, on Reddit, it has been found that content that initially receives a few down- or upvotes often continues in that direction. This is referred to as "bandwagon/snowball voting" by Reddit users and administrators.{{cite web|title=Moderators: New subreddit feature – comment scores may be hidden for a defined time period after posting • /r/modnews|url=https://www.reddit.com/r/modnews/comments/1dd0xw/moderators_new_subreddit_feature_comment_scores/|website=reddit|date=29 April 2013 |access-date=4 February 2017|language=en}}
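The snowball dynamic described above can be illustrated with a toy simulation. This is an illustrative model only, not Reddit's actual voting mechanics, and all parameters are invented: each voter either votes independently or, with some probability, copies the sign of the current score, so the direction of early votes tends to persist.

```python
import random

def simulate_thread(initial_vote, n_voters=1000, herd_prob=0.3, seed=0):
    """Toy model of bandwagon voting: each voter either votes
    independently (50/50) or, with probability herd_prob, copies
    the sign of the current score."""
    rng = random.Random(seed)
    score = initial_vote
    for _ in range(n_voters):
        if rng.random() < herd_prob and score != 0:
            score += 1 if score > 0 else -1   # follow the crowd
        else:
            score += rng.choice((1, -1))      # independent vote
    return score

# Identical content, opposite first vote: count how many of 200 threads
# end with a positive score, depending only on the seeded first vote.
up = sum(simulate_thread(+1, seed=s) > 0 for s in range(200))
down = sum(simulate_thread(-1, seed=s) > 0 for s in range(200))
print(up, down)  # threads seeded positive end positive far more often
```

In this sketch the content is identical in every thread; only the arbitrary first vote differs, yet it biases the final outcome, which is the bandwagon effect the Reddit observation describes.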
Echo chambers and filter bubbles may be created by website administrators or moderators who lock out people with differing viewpoints or establish certain rules, or they may arise from the typical member viewpoints of online sub-communities or Internet "tribes".
Fake news does not need to be read to have an effect: through sheer quantity and emotional impact, its headlines and sound bites alone can influence perceptions.{{citation needed|date=February 2017}} The apparent prevalence of specific points, views, issues and people can be amplified, stimulated or simulated. (See also: Mere-exposure effect)
Clarifications, conspiracy debunking and exposure of fake news often come too late, when the damage is already done, and/or fail to reach the bulk of the audience of the associated misinformation.{{cite web|title=Die Scheinwelt von Facebook & Co. (German-language documentary by the ZDF)|url=https://www.zdf.de/dokumentation/zdfinfo-doku/die-scheinwelt-von-facebook-co-104.html|access-date=4 February 2017|language=de}}{{Better source needed|date=February 2017}}
Social media activities and other data can be used to analyze the personality of people and predict their behaviour and preferences.{{cite web|title=Ich habe nur gezeigt, dass es die Bombe gibt|url=https://www.dasmagazin.ch/2016/12/03/ich-habe-nur-gezeigt-dass-es-die-bombe-gibt/|publisher=Das Magazin|access-date=30 April 2017|date=3 December 2016}}{{cite news|last1=Beuth|first1=Patrick|title=US-Wahl: Big Data allein entscheidet keine Wahl|url=http://www.zeit.de/digital/internet/2016-12/us-wahl-donald-trump-facebook-big-data-cambridge-analytica|newspaper=Die Zeit|access-date=30 April 2017|date=6 December 2016}} Michal Kosinski developed such a procedure. It can be used to deliver media or information tailored to a person's psyche, e.g. via Facebook, and according to reports this may have played an integral part in Donald Trump's 2016 election win.{{cite web|title=The Data That Turned the World Upside Down|url=https://www.vice.com/en/article/how-our-likes-helped-trump-win/|publisher=Motherboard|access-date=30 April 2017|language=en-us|date=2017-01-28}} (See also: Targeted advertising, Personalized marketing)
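The idea of predicting traits from behavioral traces can be sketched as follows. This is not Kosinski's actual model, which was trained on millions of real Facebook Likes; the pages, labels, and training data below are invented purely for illustration, using a plain logistic regression over binary "like" features.

```python
import math

# Toy sketch of trait prediction from binary "likes". Real systems use
# regression models trained on millions of users; the pages, labels, and
# data here are invented purely for illustration.
PAGES = ["poetry", "heavy_metal", "philosophy", "reality_tv"]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(data, lr=0.5, epochs=500):
    """Plain gradient-descent logistic regression: like-vector -> trait label."""
    w = [0.0] * len(PAGES)
    b = 0.0
    for _ in range(epochs):
        for likes, label in data:
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, likes)) + b)
            err = p - label                       # cross-entropy gradient
            w = [wi - lr * err * xi for wi, xi in zip(w, likes)]
            b -= lr * err
    return w, b

# Invented training set: 1 = trait "high", 0 = trait "low".
train = [([1, 0, 1, 0], 1), ([1, 1, 1, 0], 1),
         ([0, 0, 0, 1], 0), ([0, 1, 0, 1], 0)]
w, b = train_logistic(train)
score = sigmoid(sum(wi * xi for wi, xi in zip(w, [1, 0, 1, 0])) + b)
print(round(score, 2))  # close to 1: this like-pattern resembles the "high" group
```

The manipulation risk the paragraph describes follows directly: once a score like this is attached to a user, content can be tailored to whichever trait bucket they fall into.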
= Algorithms, echo chambers and polarization =
{{Main|Media pluralism}}
Due to overabundance of online content, social networking platforms and search engines have leveraged algorithms to tailor and personalize users' feeds based on their individual preferences. However, algorithms also restrict exposure to different viewpoints and content, leading to the creation of echo chambers or filter bubbles.{{Cite journal|last=Sacasas|first=L. M.|date=2020|title=The Analog City and the Digital City|url=https://www.jstor.org/stable/26898497|journal=The New Atlantis|issue=61|pages=3–18|jstor=26898497|issn=1543-1215}}
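A minimal sketch of such preference-based filtering follows. These are assumed mechanics, not any platform's real algorithm, and the items and topics are invented; it shows how ranking purely by past engagement can keep whole topics out of a user's feed.

```python
from collections import Counter

# Invented items: (headline, set of topics). Not real platform data.
ITEMS = [
    ("city builds new bike lanes", {"local", "transport"}),
    ("parliament debates tax bill", {"politics"}),
    ("opposition slams tax bill", {"politics"}),
    ("team wins championship", {"sports"}),
    ("new stadium proposed", {"sports", "local"}),
]

def rank_feed(items, history):
    """Score each item by how many of its topics the user engaged with before."""
    interests = Counter(t for topics in history for t in topics)
    return sorted(items, key=lambda it: -sum(interests[t] for t in it[1]))

pool = list(ITEMS)
history = []                 # topic sets of items the user clicked
for _ in range(3):           # three feed refreshes
    feed = rank_feed(pool, history)
    clicked = feed[0]        # user clicks the top-ranked item
    pool.remove(clicked)
    history.append(clicked[1])

print(sorted(set().union(*history)))  # → ['local', 'sports', 'transport']
```

After three refreshes the user has only ever seen "local", "transport" and "sports" items at the top of the feed; "politics" never surfaces, even though it makes up two of the five available items. That self-reinforcing narrowing is the filter-bubble mechanism described above.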
With the help of algorithms, filter bubbles influence users' choices and perception of reality by giving the impression that a particular point of view or representation is widely shared. Following the United Kingdom's 2016 referendum on membership of the European Union and the United States presidential election the same year, this gained attention as many individuals confessed their surprise at results that seemed very distant from their expectations. The range of pluralism is influenced by the personalization of services and the way this diminishes choice.{{Cite book|title=World Trends in Freedom of Expression and Media Development Global Report 2017/2018|publisher=UNESCO|year=2018|url=http://www.unesco.org/ulis/cgi-bin/ulis.pl?catno=261065&set=005B2B7D1D_3_314&gp=1&lin=1&ll=1|pages=202}} Five types of manipulative verbal influence have been identified in media texts: self-expression, semantic speech strategies, persuasive strategies, swipe films and information manipulation. The vocabulary toolkit for speech manipulation includes euphemism, mood vocabulary, situational adjectives, slogans, verbal metaphors, etc.{{Cite journal|last1=Kalinina|first1=Anna V.|last2=Yusupova|first2=Elena E.|last3=Voevoda|first3=Elena V.|date=2019-05-18|title=Means of Influence on Public Opinion in Political Context: Speech Manipulation in the Media|url=https://mediawatchjournal.in/means-of-influence-on-public-opinion-in-political-context-speech-manipulation-in-the-media/|journal=Media Watch|volume=10|issue=2|doi=10.15655/mw/2019/v10i2/49625|s2cid=182112133|issn=2249-8818}}
Research on echo chambers by Flaxman, Goel, and Rao,Flaxman, Seth, Sharad Goel, and Justin M. Rao. 2016. [https://academic.oup.com/poq/article/80/S1/298/2223402 Filter bubbles, echo chambers, and online news consumption]. Public Opinion Quarterly 80 (S1): 298–320. Pariser,Pariser, Eli. 2011. The filter bubble: What the Internet is hiding from you. Penguin UK. Available at https://books.google.co.uk/?hl=en&lr=&oi=fnd&pg=PT3&dq=eli+pariser+filter&ots=g3PrCprRV2&sig=_FI8GISLrm3WNoMKMlqSTJNOFw Accessed 20 May 2017. and Grömping{{Cite journal | doi=10.1177/1326365X14539185|title = Echo Chambers| journal=Asia Pacific Media Educator| volume=24| pages=39–59|year = 2014|last1 = Grömping|first1 = Max|s2cid = 154399136|hdl = 10072/400212|hdl-access = free}} suggests that the use of social media and search engines tends to increase ideological distance among individuals.
Comparisons between online and off-line segregation have indicated that segregation tends to be higher in face-to-face interactions with neighbors, co-workers, or family members,Gentzkow, Matthew, and Jesse M. Shapiro. 2011. [https://web.stanford.edu/~gentzkow/research/echo_chambers.pdf Ideological segregation online and offline]. The Quarterly Journal of Economics 126 (4): 1799–1839. and reviews of existing research have indicated that the available empirical evidence does not support the most pessimistic views about polarization.Zuiderveen Borgesius, Frederik J., Damian Trilling, Judith Moeller, Balázs Bodó, Claes H. de Vreese, and Natali Helberger. 2016. Should We Worry about Filter Bubbles? Available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2758126. Accessed 20 May 2017 A 2015 study suggested that individuals' own choices drive algorithmic filtering, limiting exposure to a range of content.{{Cite journal |last1=Bakshy |first1=Eytan |last2=Messing |first2=Solomon |last3=Adamic |first3=Lada A. |date=2015-06-05 |title=Exposure to ideologically diverse news and opinion on Facebook |journal=Science |language=en |volume=348 |issue=6239 |pages=1130–1132 |doi=10.1126/science.aaa1160 |pmid=25953820 |bibcode=2015Sci...348.1130B |s2cid=206632821 |issn=0036-8075|doi-access=free }} While algorithms may not be causing polarization, they could amplify it, representing a significant component of the new information landscape.Hargittai. 2015. Why doesn't Science publish important methods info prominently? Crooked Timber. Available at http://crookedtimber.org/2015/05/07/why-doesnt-science-publish-important-methods-info-prominently/. Accessed 20 May 2017.
Research and use by intelligence and military agencies
File:JTRIG report 2011 - slide 1.png
{{See also|Psychological warfare|State-sponsored Internet propaganda|CIA influence on public opinion}}
The Joint Threat Research Intelligence Group (JTRIG), a unit of the Government Communications Headquarters (GCHQ), the British intelligence agency,{{cite news|title=Snowden leaks: GCHQ 'attacked Anonymous' hackers|url=https://www.bbc.co.uk/news/technology-26049448|publisher=BBC|access-date=7 February 2014|date=5 February 2014|work=BBC News}} was revealed as part of the global surveillance disclosures in documents leaked by the former National Security Agency contractor Edward Snowden.{{cite web|title=Snowden Docs: British Spies Used Sex and 'Dirty Tricks'|url=https://www.nbcnews.com/news/investigations/snowden-docs-british-spies-used-sex-dirty-tricks-n23091|publisher=NBC News|access-date=7 February 2014|date=7 February 2014}} Its mission scope includes using "dirty tricks" to "destroy, deny, degrade [and] disrupt" enemies.{{cite web|url=https://firstlook.org/theintercept/2014/02/24/jtrig-manipulation/|archive-url=https://archive.today/20140224234346/https://firstlook.org/theintercept/2014/02/24/jtrig-manipulation/|url-status=dead|archive-date=February 24, 2014|title=How Covert Agents Infiltrate the Internet to Manipulate, Deceive, and Destroy Reputations|author=Glenn Greenwald|website=The Intercept|date=2014-02-24}} The leaked documents include the DISRUPTION Operational Playbook slide presentation by GCHQ. Core tactics include injecting false material onto the Internet in order to destroy the reputation of targets, and manipulating online discourse and activism. Methods used for this include posting material to the Internet and falsely attributing it to someone else, pretending to be a victim of the target individual whose reputation is intended to be destroyed, and posting "negative information" on various forums.{{cite web|last1=Greenwald|first1=Glenn|title=How Covert Agents Infiltrate the Internet to Manipulate, Deceive, and Destroy Reputations|url=https://theintercept.com/2014/02/24/jtrig-manipulation/|website=The Intercept|access-date=4 February 2017|date=2014-02-24}}
Known as "Effects" operations, the work of JTRIG had become a "major part" of GCHQ's operations by 2010. The unit's online propaganda efforts (named "Online Covert Action"{{citation needed|date=March 2017}}) utilize "mass messaging" and the "pushing [of] stories" via Twitter, Flickr, Facebook and YouTube. Online "false flag" operations are also used by JTRIG against targets. JTRIG has also changed photographs on social media sites and emailed and texted colleagues and neighbours with "unsavory information" about targeted individuals. In June 2015, NSA files published by Glenn Greenwald revealed new details about JTRIG's work covertly manipulating online communities.Greenwald, Glenn and Andrew Fishman. [https://firstlook.org/theintercept/2015/06/22/controversial-gchq-unit-domestic-law-enforcement-propaganda/ Controversial GCHQ Unit Engaged in Domestic Law Enforcement, Online Propaganda, Psychology Research] {{Webarchive|url=https://web.archive.org/web/20150625203557/https://firstlook.org/theintercept/2015/06/22/controversial-gchq-unit-domestic-law-enforcement-propaganda/ |date=2015-06-25 }}. The Intercept. 2015-06-22. The disclosures also revealed the technique of "credential harvesting", in which journalists could be used to disseminate information, and non-British journalists could be identified who, once manipulated, could give information to the intended target of a secret campaign, perhaps providing access during an interview. It is unknown whether the journalists would be aware that they were being manipulated.
Furthermore, Russia is frequently accused of financing "trolls" to post pro-Russian opinions across the Internet. The Internet Research Agency has become known for employing hundreds of Russians to post propaganda online under fake identities in order to create the illusion of massive support.{{cite news|last1=Chen|first1=Adrian|title=The Agency|url=https://www.nytimes.com/2015/06/07/magazine/the-agency.html|newspaper=The New York Times|access-date=30 April 2017|date=2 June 2015}} In 2016 Russia was accused of sophisticated propaganda campaigns to spread fake news with the goal of punishing Democrat Hillary Clinton and helping Republican Donald Trump during the 2016 presidential election as well as undermining faith in American democracy.{{citation|title=Trolls for Trump – How Russia Dominates Your Twitter Feed to Promote Lies (And, Trump, Too)|date=6 August 2016|url=http://www.thedailybeast.com/articles/2016/08/06/how-russia-dominates-your-twitter-feed-to-promote-lies-and-trump-too.html|last2=Weisburd|first2=Andrew |last1=Watts|first1= Clint |newspaper=The Daily Beast|access-date=24 November 2016}}{{citation|url=https://www.pbs.org/newshour/rundown/russian-propaganda-effort-behind-flood-fake-news-preceded-election/|work=PBS NewsHour|access-date=26 November 2016|date=25 November 2016|title=Russian propaganda effort likely behind flood of fake news that preceded election|agency=Associated Press}}{{citation|url=http://www.9news.com.au/world/2016/11/26/08/45/russian-propaganda-campaign-reportedly-spread-fake-news-during-us-election|work=Nine News|agency=Agence France-Presse|access-date=26 November 2016|date=26 November 2016|title=Russian propaganda campaign reportedly spread 'fake news' during US election}}
In a 2017 report{{cite web|title=Information Operations and Facebook|url=https://i2.res.24o.it/pdf2010/Editrice/ILSOLE24ORE/ILSOLE24ORE/Online/_Oggetti_Embedded/Documenti/2017/04/28/facebook-and-information-operations-v1.pdf|access-date=30 April 2017|date=27 April 2017|archive-date=8 January 2022|archive-url=https://web.archive.org/web/20220108122517/https://i2.res.24o.it/pdf2010/Editrice/ILSOLE24ORE/ILSOLE24ORE/Online/_Oggetti_Embedded/Documenti/2017/04/28/facebook-and-information-operations-v1.pdf|via=Il Sole 24 Ore}} Facebook publicly stated that its site had been exploited by governments for the manipulation of public opinion in other countries, including during the presidential elections in the US and France.{{cite news|last1=Solon|first1=Olivia|title=Facebook admits: governments exploited us to spread propaganda|url=https://www.theguardian.com/technology/2017/apr/27/facebook-report-government-propaganda|newspaper=The Guardian|access-date=30 April 2017|date=27 April 2017}}{{cite news|title=Konzern dokumentiert erstmals Probleme: Geheimdienste nutzen Facebook zur Desinformation|url=http://www.spiegel.de/netzwelt/web/facebook-geheimdienste-nutzen-das-soziale-netzwerk-zur-desinformation-a-1145224.html|newspaper=SPIEGEL ONLINE|access-date=30 April 2017|date=2017-04-28|last1=Reinbold|first1=Fabian}}{{cite web|title=Report: Facebook will nicht mehr für Propaganda missbraucht werden|url=https://www.wired.de/collection/tech/facebook-fake-account-falschmeldung-sicherheit-politik-wahlen-fake-news|publisher=WIRED Germany|access-date=30 April 2017|language=de|date=28 April 2017}} The report identified three main components of an information operations campaign: targeted data collection, content creation, and false amplification. These include stealing and exposing non-public information; spreading stories, false or real, to third parties through fake accounts; and coordinating fake accounts to manipulate political discussion, such as amplifying some voices while repressing others.{{cite web|title=Facebook targets coordinated campaigns spreading fake news|url=https://www.cnet.com/news/facebook-targets-coordinated-campaigns-spreading-fake-news/|website=CNET|access-date=30 April 2017|language=en}}{{cite web|title=Facebook, for the first time, acknowledges election manipulation|website=CBS News|date=28 April 2017 |url=http://www.cbsnews.com/news/facebook-for-the-first-time-acknowledges-election-manipulation/|access-date=30 April 2017|language=en}}
In politics
{{See also|Russian interference in the 2016 United States elections}}
In 2016 Andrés Sepúlveda disclosed that he had manipulated public opinion to rig elections in Latin America. According to him, with a budget of $600,000 he led a team of hackers that stole campaign strategies, manipulated social media to create false waves of enthusiasm and derision, and installed spyware in opposition offices to help Enrique Peña Nieto, a right-of-center candidate, win Mexico's 2012 presidential election.{{cite news|title=How to Hack an Election|newspaper=Bloomberg.com |url=https://www.bloomberg.com/features/2016-how-to-hack-an-election/|publisher=Bloomberg|access-date=22 January 2017}}{{cite news|title=Man claims he rigged elections in most Latin American countries over 8 years|url=https://www.independent.co.uk/news/world/americas/political-cyberhacker-andres-sepulveda-reveals-how-he-digitally-rigged-elections-across-latin-a6965161.html|newspaper=The Independent|access-date=22 January 2017|date=2 April 2016}}
In the run-up to India's 2014 elections, both the Bharatiya Janata Party (BJP) and the Congress party were accused of hiring "political trolls" to talk favourably about them on blogs and social media.{{cite news|last1=Shearlaw|first1=Maeve|title=From Britain to Beijing: how governments manipulate the internet|url=https://www.theguardian.com/world/2015/apr/02/russia-troll-factory-kremlin-cyber-army-comparisons|newspaper=The Guardian|access-date=4 February 2017|date=2 April 2015}}
The Chinese government is also believed to run a so-called "50-cent army" (a reference to how much they are said to be paid) and the "Internet Water Army" to reinforce favourable opinion towards it and the Chinese Communist Party (CCP) as well as to suppress dissent.{{cite book|last1=MacKinnon|first1=Rebecca|title=Consent of the networked: the world-wide struggle for Internet freedom|date=2012|publisher=Basic Books|location=New York|isbn=978-0-465-02442-1|title-link=Consent of the Networked}}
In December 2014 the Ukrainian information ministry was launched to counter Russian propaganda with one of its first tasks being the creation of social media accounts (also known as the i-Army) and amassing friends posing as residents of eastern Ukraine.{{cite web|title=Ukraine's new online army in media war with Russia|url=https://www.bbc.co.uk/monitoring/ukraines-new-online-army-in-media-war-with-russia|publisher=BBC|access-date=4 February 2017}}
Twitter suspended a number of bot accounts that appeared to be spreading pro-Saudi Arabian tweets about the disappearance of Saudi dissident journalist Jamal Khashoggi.{{cite news |title=Twitter pulls down bot network that pushed pro-Saudi talking points about disappeared journalist |url=https://www.nbcnews.com/tech/tech-news/exclusive-twitter-pulls-down-bot-network-pushing-pro-saudi-talking-n921871 |work=NBC News |date=19 October 2018}}
A report by Mediapart claimed that the UAE, through a secret services agent named Mohammed, used the Switzerland-based firm Alp Services to run manipulation campaigns against Emirati opponents. The head of Alp Services, Mario Brero, used fictitious accounts that published fake articles under pseudonyms to attack Qatar and the Muslim Brotherhood networks in Europe. The UAE assigned Alp to publish at least 100 articles per year that were critical of Qatar.{{cite web|url=https://www.mediapart.fr/en/journal/france/040323/leaked-data-shows-extent-uaes-meddling-france|title=Leaked data shows extent of UAE's meddling in France|access-date=4 March 2023|website=MediaPart|date=4 March 2023 }}
In business and marketing
Internet manipulation is also used within business and marketing as a way to influence consumers. Writing in Forbes, Stu Sjouwerman discusses how online disinformation can spread rapidly across social platforms and suggests that legitimate and fake news are blurring together, with both being weaponized in large-scale campaigns to influence the general population.{{Cite web |last=Sjouwerman |first=Stu |title=Council Post: How Social Media Manipulation Threatens Your Business — And What You Can Do About It |url=https://www.forbes.com/councils/forbestechcouncil/2020/10/01/how-social-media-manipulation-threatens-your-business---and-what-you-can-do-about-it/ |access-date=2024-12-11 |website=Forbes |language=en}}
Trolling and other applications
Hackers, hired professionals and private citizens have all been reported to engage in internet manipulation using software, including Internet bots such as social bots, votebots and clickbots.{{Cite journal |last1=Gorwa |first1=Robert |last2=Guilbeault |first2=Douglas |date=2018-08-10 |title=Unpacking the Social Media Bot: A Typology to Guide Research and Policy: Unpacking the Social Media Bot |journal=Policy & Internet |language=en |arxiv=1801.06863 |doi=10.1002/poi3.184 |s2cid=51877148}} In April 2009, Internet trolls of 4chan voted Christopher Poole, founder of the site, as the world's most influential person of 2008, with 16,794,368 votes in an open Internet poll conducted by Time magazine.{{cite magazine|url=http://www.time.com/time/arts/article/0,8599,1894028,00.html |archive-url=https://web.archive.org/web/20090428114850/http://www.time.com/time/arts/article/0,8599,1894028,00.html |url-status=dead |archive-date=April 28, 2009 |title=The World's Most Influential Person Is... |magazine=TIME |date=April 27, 2009 |access-date=September 2, 2009}} The results were questioned even before the poll completed, as automated voting programs and manual ballot stuffing were used to influence the vote.{{cite magazine|url=https://www.pcmag.com/article2/0,2817,2345987,00.asp |title=4Chan Followers Hack Time's 'Influential' Poll |last=Heater |first=Brian |magazine=PC Magazine |date=April 27, 2009 |access-date=April 27, 2009 |archive-url=https://web.archive.org/web/20090430151912/http://www.pcmag.com/article2/0%2C2817%2C2345987%2C00.asp |archive-date=April 30, 2009 |url-status=live }}{{cite news | url = https://www.washingtonpost.com/wp-dyn/content/article/2009/04/21/AR2009042101864.html | last = Schonfeld | first = Erick | title = 4Chan Takes Over The Time 100 |newspaper=Washington Post | date = April 21, 2009 | access-date =April 27, 2009 }}{{cite web|url=http://musicmachinery.com/2009/04/27/moot-wins-time-inc-loses/ |title=moot wins, Time Inc. loses « Music Machinery |publisher=Musicmachinery.com |date=April 27, 2009 |access-date=September 2, 2009| archive-url= https://web.archive.org/web/20090503031919/http://musicmachinery.com/2009/04/27/moot-wins-time-inc-loses/| archive-date=May 3, 2009| url-status= live}} 4chan's interference with the vote seemed increasingly likely when it was found that reading the first letter of the first 21 candidates in the poll spelled out a phrase containing two 4chan memes: "Marblecake. Also, The Game".{{cite web|author =Reddit Top Links |url=https://www.buzzfeed.com/reddit/also-the-work-of-4chan-pic |archive-url=https://web.archive.org/web/20090415164347/http://www.buzzfeed.com/reddit/also-the-work-of-4chan-pic |archive-date=April 15, 2009 |title=Marble Cake Also the Game [PIC] |publisher=Buzzfeed.com |access-date=September 2, 2009}}
Jokesters and politically oriented hacktivists may share sophisticated knowledge of how to manipulate the Web and social media.{{cite news|last1=Maslin|first1=Janet|title='We Are Anonymous' by Parmy Olson|url=https://www.nytimes.com/2012/06/01/books/we-are-anonymous-by-parmy-olson.html|newspaper=The New York Times|access-date=4 February 2017|date=31 May 2012}}
Countermeasures
{{See also|Fact checking|Web literacy}}
In Wired it was noted that nation-state rules such as compulsory registration and threats of punishment are not adequate measures to combat the problem of online bots.{{cite web|title=Debatte um "Social Bots": Blinder Aktionismus gegen die eigene Hilflosigkeit|url=https://www.wired.de/collection/tech/debatte-um-social-bots-blinder-aktionismus-gegen-die-eigene-hilflosigkeit|publisher=WIRED Germany|access-date=4 February 2017|language=de|date=23 January 2017}}
To guard against the issue of prior ratings influencing perception, several websites such as Reddit have taken steps such as hiding the vote count for a specified time.
Other potential measures under discussion include flagging posts that are likely satire or false.{{cite web|title=How technology is changing the way we think – Daniel Suarez, Jan Kalbitzer & Frank Rieger|website = YouTube|url=https://www.youtube.com/watch?v=rTx78aaZ6w0|access-date=30 April 2017|date=7 December 2016}} For instance, in December 2016 Facebook announced that disputed articles would be marked with the help of users and outside fact-checkers.{{cite news|last1=Jamieson|first1=Amber|last2=Solon|first2=Olivia|title=Facebook to begin flagging fake news in response to mounting criticism|url=https://www.theguardian.com/technology/2016/dec/15/facebook-flag-fake-news-fact-check|newspaper=The Guardian|access-date=4 February 2017|date=15 December 2016}} The company also seeks ways to identify 'information operations' and fake accounts, and suspended 30,000 accounts before the presidential election in France in a strike against information operations.
Tim Berners-Lee, inventor of the World Wide Web, considers putting a few companies in charge of deciding what is or is not true a risky proposition and states that openness can make the web more truthful. As an example he points to Wikipedia, which, while not perfect, allows anyone to edit, with the key to its success being not just the technology but also the governance of the site: namely, its countless volunteers and its ways of determining what is or is not true.{{cite magazine|last1=Finley|first1=Klint|title=Tim Berners-Lee, Inventor of the Web, Plots a Radical Overhaul of His Creation|magazine=Wired|url=https://www.wired.com/2017/04/tim-berners-lee-inventor-web-plots-radical-overhaul-creation/|access-date=4 April 2017|date=2017-04-04}}
Furthermore, various kinds of software may be used to combat this problem, such as fact-checking software, or voluntary browser extensions that record every website one reads and use that browsing history to deliver corrections to readers of a story once some kind of consensus has been reached that the story is false.{{Original research inline|date=February 2017}}
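The extension logic described above could be sketched as follows; this is a hypothetical illustration (the class and method names are invented, and a real extension would persist history and fetch the debunked list from an external fact-checking service):

```python
class CorrectionTracker:
    """Records pages a user has read, then reports which of them
    were later judged false by some external consensus process."""

    def __init__(self):
        self.history = set()  # URLs the user has visited

    def record_visit(self, url: str) -> None:
        """Store a visited URL in the local reading history."""
        self.history.add(url)

    def corrections_due(self, debunked_urls: set) -> set:
        """Return the previously read URLs that appear on a
        later-published list of debunked stories."""
        return self.history & debunked_urls
```

A user who read two stories and later receives a debunked-URL list would be notified only about the stories they actually read.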
In addition, Daniel Suarez asks society to value critical analytic thinking and suggests education reforms, such as the introduction of 'formal logic' as a discipline in schools and training in media literacy and objective evaluation.
= Government responses =
According to a study by the Oxford Internet Institute, at least 43 countries around the globe have proposed or implemented regulations specifically designed to tackle different aspects of influence campaigns, including fake news, social media abuse, and election interference.{{Cite journal|last1=Bradshaw|first1=Samantha|last2=Neudert|first2=Lisa-Maria|last3=Howard|first3=Philip N.|date=2018|title=Government Responses to Malicious Use of Social Media|url=https://www.stratcomcoe.org/government-responses-malicious-use-social-media|journal=Nato Stratcom Coe|isbn=978-9934-564-31-4|via=20}}
== Germany ==
In Germany, during the period preceding the elections in September 2017, all major political parties except the AfD publicly announced that they would not use social bots in their campaigns, and committed to strongly condemning such use of online bots.
Moves towards regulation of social media have also been made: in early 2017, three German states (Hessen, Bavaria, and Saxony-Anhalt) proposed a law under which social media users could face prosecution for violating the terms and conditions of a platform. For example, using a pseudonym on Facebook, or creating a fake account, would be punishable by up to one year's imprisonment.{{cite web |last1=Reuter |first1=Markus |title=Hausfriedensbruch 4.0: Zutritt für Fake News und Bots strengstens verboten |url=https://netzpolitik.org/2017/hausfriedensbruch-4-0-zutritt-fuer-fake-news-und-bots-strengstens-verboten/ |website=Netzpolitik |date=17 January 2017 |access-date=24 October 2019}}
== Italy ==
In early 2018, the Italian communications regulator AGCOM published a set of guidelines on its website targeting the elections in March of that year. The six main topics are:{{cite web |last1=Bellezza |first1=Marco |last2=Frigerio |first2=Filippo Frigerio |title=ITALY: First Attempt to (Self)Regulate the Online Political Propaganda |date=6 February 2018 |url=http://www.medialaws.eu/italy-first-attempt-to-selfregulate-the-online-political-propaganda/}}
- Equal treatment of political subjects
- Transparency of political propaganda
- Illicit content and activities whose dissemination is forbidden (i.e. polls)
- Social media accounts of public administrations
- A ban on political propaganda on election day and the day before
- Recommendations for stronger fact-checking services
== France ==
In November 2018, a law against the manipulation of information was passed in France. The law stipulates that during campaign periods:{{cite web |title=Against information manipulation |url=https://www.gouvernement.fr/en/against-information-manipulation |website=Gouvernement.fr |access-date=24 October 2019}}
- Digital platforms must disclose the amount paid for ads and the names of their authors. Above a certain traffic threshold, platforms are required to have a representative present in France and must publish the algorithms they use.
- An interim judge may issue a legal injunction to halt the spread of fake news swiftly. Such 'fake news' must satisfy the following: (a) it must be manifest; (b) it must be disseminated on a massive scale; and (c) it must lead to a disturbance of the peace or compromise the outcome of an election.
== Malaysia ==
In April 2018, the Malaysian parliament passed the Anti-Fake News Act, which defined fake news as 'news, information, data and reports which is or are wholly or partly false.'{{cite news |last1=Menon |first1=Praveen |title=Malaysia outlaws 'fake news'; sets jail of up to six years |url=https://www.reuters.com/article/us-malaysia-election-fakenews/malaysia-outlaws-fake-news-sets-jail-of-up-to-six-years-idUSKCN1H90Y9 |website=Reuters |date=2 April 2018 |access-date=24 October 2019}} The law applied to citizens and to those working at digital publications, with possible imprisonment of up to six years. However, it was repealed after heavy criticism in August 2018.{{cite web |last1=Yeung |first1=Jessie |title=Malaysia repeals controversial fake news law |url=https://edition.cnn.com/2018/08/17/asia/malaysia-fake-news-law-repeal-intl/index.html |website=CNN |date=17 August 2018 |access-date=24 October 2019}}
== Kenya ==
In May 2018, President Uhuru Kenyatta signed into law the Computer and Cybercrimes bill, which criminalised cybercrimes including cyberbullying and cyberespionage. If a person "intentionally publishes false, misleading or fictitious data or misinforms with intent that the data shall be considered or acted upon as authentic," they are subject to fines and up to two years' imprisonment.{{cite web |last1=Schwartz |first1=Arielle |title=Kenya signs bill criminalising fake news |url=https://mg.co.za/article/2018-05-16-kenya-signs-bill-criminalising-fake-news |website=Mail & Guardian |date=16 May 2018 |access-date=24 October 2019}}
= Research =
German chancellor Angela Merkel has called on the Bundestag to deal with the possibilities of political manipulation by social bots and fake news.{{cite news|title=Bundestagsdebatte: Merkel schimpft über Internet-Trolle|url=http://www.sueddeutsche.de/politik/bundestagsdebatte-merkel-schimpft-ueber-internet-trolle-1.3262752|publisher=Süddeutsche Zeitung|access-date=4 February 2017|language=de|date=1 November 2016|newspaper=Sueddeutsche.de}}
= Sources =
{{Free-content attribution
|title=World Trends in Freedom of Expression and Media Development Global Report 2017/2018|author=University of Oxford|publisher=UNESCO|page numbers=202|source=|documentURL=http://unesdoc.unesco.org/images/0026/002610/261065e.pdf|license statement URL=http://www.unesco.org/ulis/cgi-bin/ulis.pl?catno=261065&set=005B2A1A4F_2_135&gp=1&lin=1&ll=1|license=CC BY SA 3.0 IGO}}
= References =
{{reflist|30em}}
= External links =
- [https://www.youtube.com/watch?v=rTx78aaZ6w0 How technology is changing the way we think], Daniel Suarez talk on YouTube
- [https://www.youtube.com/watch?v=Gyl9WIjh2B8 How "Bots" Control Your Life], Daniel Suarez talk on YouTube
- {{cite web|title=The new power of manipulation|date=18 October 2016|url=http://www.dw.com/en/the-new-power-of-manipulation/a-36074925|publisher=Deutsche Welle|language=en}}
{{Disinformation}}
{{media manipulation}}