Summary and context
Hoaxlines found that accounts that had pushed false claims about the 2009 swine flu and the vaccine against it also promoted Russian state media and pro-Kremlin narratives and disinformation years later.1
Although these accounts’ behavior and traits suggested inauthenticity, and some exclusively spread false information, some remain active on the platform today [5–9]. Some accounts fell dormant, active only between 2009 and 2010, but others, like the example in this report, spread content and claims likely intended to influence real users [6, 10–14].
We encountered the focus of this case study while collecting data for a retrospective longitudinal study. The example user (data from other accounts will be released in a subsequent report) posted content, took stances, and behaved in ways similar to accounts found in past, confirmed information operations [15–19]. Hoaxlines found that in the preceding 24 to 48 hours (April 7 to 9), the case study account had:
- Attempted to deflect blame from Russia for the brutal slaughter of Ukrainian civilians in Bucha, Ukraine.
- Shared an Iranian state-media article while claiming that the United States is responsible for the deaths of Ukrainians because it has supplied weapons to Ukraine.
- Shared Russian state-controlled content that claimed that Ukrainian soldiers were using civilians as human shields and accused Ukraine of taking food away from people in Donbas. The accusation is a particularly dark assertion given Russia’s history of killing millions of Ukrainians through intentional starvation [22, 23].
- Accused Jewish people of instigating World War 3 and asserted that Zelensky is responsible for the deaths of any Ukrainian male because he “ordered” them “to become combatants.”
- Posted “This is a war, a US war against Russia. Why? Because the US is NATO. What is NATO? NATO is a Horribles Parade costume, it’s a Halloween mask,” with an article from Iranian state media.
- Claimed that new variants were endangering Americans because the US government gave “Billions to a bunch of Neo-Nazi's [sic] and a Zionist Puppet in Ukraine.”
The cost of a failure to act
Had the account been removed in 2010, when it promoted conspiracy theories about the swine flu and the vaccine to prevent it, it would not be on the platform today helping the Kremlin sow confusion and avoid accountability for war crimes [24–26]. EUvsDisinfo wrote on April 7 [20]:
The Kremlin is trying to occupy the information space by flooding it with contradicting “explanations” of the events. The goal is not only to deflect the blame for this particular atrocity against peaceful civilians but also to pre-emptively shape narratives for countering and discrediting any evidence or investigation into Russian war crimes in Ukraine.
Hoaxlines has previously reported that suspensions do not appear to, and perhaps cannot, change the behavior of accounts [27, 28]. Disciplinary action cannot cause a genuine user to materialize. Thus this response has little chance of being effective with this type of account.
Examples of content
Here are examples of content from the case study subject that made it identifiable as an inauthentic account as long as a decade ago:
- Swine flu (H1N1) and the vaccine
- Bill Gates conspiracy theories
- Dengue fever
- Julian Assange
- Syria and chemical weapons use
- Monsanto conspiracies
- Ebola virus
- The downing of MH17
- Invasion of Crimea
- John McCain and ISIS
- Everything is George Soros’ fault
- Hillary Clinton’s emails
- Zika virus and the vaccine
- COVID origin
- Sputnik V (Russian COVID vaccine) is the best
- NATO is out to get Russia
- Bioweapons in Ukraine (2022)
- Anti-semitism and Azov
- Bucha Massacre denial
1. Watts C. Russia’s Disinformation Ecosystem. In: Selected Wisdom [Internet]. 2021 [cited 10 Dec 2021]. Available: https://clintwatts.substack.com/p/russias-disinformation-ecosystem?s=r
2. Watts C. Russia’s Propaganda & Disinformation Ecosystem. In: Selected Wisdom [Internet]. 2022 [cited 10 Dec 2021]. Available: https://clintwatts.substack.com/p/russias-propaganda-and-disinformation?s=r
3. United States Government. Treasury escalates sanctions against the Russian Government’s attempts to influence U.S. elections. In: U.S. Department of the Treasury [Internet]. 2021 [cited 10 Apr 2022]. Available: https://home.treasury.gov/news/press-releases/jy0126
4. Merchant N. US accuses financial website of spreading Russian propaganda. In: Associated Press [Internet]. 15 Feb 2022 [cited 10 Apr 2022]. Available: https://apnews.com/article/russia-ukraine-coronavirus-pandemic-health-moscow-media-ff4a56b7b08bcdc6adaf02313a85edd9
5. Broniatowski DA, Jamison AM, Qi S, AlKulaib L, Chen T, Benton A, et al. Weaponized Health Communication: Twitter Bots and Russian Trolls Amplify the Vaccine Debate. Am J Public Health. 2018;108: 1378–1384.
6. Prier J. Social Media as Information Warfare. Strategic Studies Quarterly. 2017;11: 50–85.
7. Sheridan K. Spot the bot: Researchers open-source tools to hunt Twitter bots. In: Dark Reading [Internet]. 6 Aug 2018 [cited 7 Oct 2021]. Available: https://www.darkreading.com/threat-intelligence/spot-the-bot-researchers-open-source-tools-to-hunt-twitter-bots
8. Dotto C. How to spot a bot (or not): The main indicators of online automation, coordination, and inauthentic activity. 28 Nov 2019 [cited 7 Oct 2021]. Available: https://firstdraftnews.org/articles/how-to-spot-a-bot-or-not-the-main-indicators-of-online-automation-co-ordination-and-inauthentic-activity/
9. Nimmo B. #BotSpot: Twelve ways to spot a bot. In: DFR Lab [Internet]. 28 Aug 2017 [cited 7 Oct 2021]. Available: https://medium.com/dfrlab/botspot-twelve-ways-to-spot-a-bot-aedc7d9c110c
10. Gamberini SJ. Social Media Weaponization: The Biohazard of Russian Disinformation Campaigns. Joint Force Quarterly. 2020;5: 99.
11. Weinberg D, Dawson J. Military Narratives and Profiles in Russian Influence Operations on Twitter. 2021. doi:10.31235/osf.io/b9a2m
12. Shao C, Ciampaglia GL, Varol O, Yang K-C, Flammini A, Menczer F. The spread of low-credibility content by social bots. Nat Commun. 2018;9: 4787.
13. Galeano K, Galeano R, Al-Khateeb S, Agarwal N. Studying the Weaponization of Social Media: Case Studies of Anti-NATO Disinformation Campaigns. Lecture Notes in Social Networks. 2020. pp. 29–51. doi:10.1007/978-3-030-41251-7_2
14. Onyango E. Kenyan influencers paid to take “guerrilla warfare” online. BBC. 12 Sep 2021. Available: https://www.bbc.com/news/world-africa-58474936. Accessed 15 Feb 2022.
15. Information Operations Archive. In: Information Operations Archive [Internet]. 2019 [cited 10 Apr 2022]. Available: https://www.io-archive.org/#/
16. Anise O, Wright J. Anatomy of Twitter Bots: Amplification Bots. In: Duo Labs [Internet]. 2018 [cited 6 Dec 2021]. Available: https://duo.com/labs/research/anatomy-of-twitter-bots-amplification-bots
17. Wild J, Godart C. Spotting bots, cyborgs and inauthentic activity. In: DataJournalism.com [Internet]. 13 Mar 2020 [cited 10 Apr 2022]. Available: https://datajournalism.com/read/handbook/verification-3/investigating-actors-content/3-spotting-bots-cyborgs-and-inauthentic-activity
18. Looft C. Anatomy of an Anti-Muslim Influence Operation. In: First Draft [Internet]. 4 Jun 2021 [cited 10 Apr 2022]. Available: https://firstdraftnews.org/articles/anti-muslim-disinformation-israel-palestine/
19. Pavliuc A. DisInfoVis: How to Understand Networks of Disinformation Through Visualization. In: Towards Data Science [Internet]. 25 Sep 2020 [cited 20 Sep 2021]. Available: https://towardsdatascience.com/disinfovis-how-to-understand-networks-of-disinformation-through-visualization-b4cb0afa0a71
20. EUvsDISINFO. Disinformation to Conceal War Crimes: Russia is Lying About Atrocities in Bucha. In: EU vs DISINFORMATION [Internet]. 7 Apr 2022 [cited 10 Apr 2022]. Available: https://euvsdisinfo.eu/disinformation-to-conceal-war-crimes-russia-is-lying-about-atrocities-in-bucha/
21. Watts C. Iran’s Disinformation Ecosystem: A Snapshot. In: Selected Wisdom [Internet]. 2021 [cited 2022]. Available: https://clintwatts.substack.com/p/irans-disinformation-ecosystem-a?s=r
22. Rudnytskyi O, Levchuk N, Wolowyna O, Shevchuk P, Kovbasiuk A. Demography of a Man-Made Human Catastrophe: the Case of Massive Famine in Ukraine 1932–1933. MAPA Digital Atlas of Ukraine Project Ukrainian Research Institute. 2015 [cited 10 Apr 2022]. Available: https://gis.huri.harvard.edu/demography-man-made-human-catastrophe
23. Serbyn R. Holodomor: The Ukrainian Genocide. PISM Series. 2010; 205–230.
24. Seitz A, Lajka A. Amid horror in Bucha, Russia relies on propaganda and disinformation. In: PBS NewsHour [Internet]. 6 Apr 2022 [cited 10 Apr 2022]. Available: https://www.pbs.org/newshour/world/amid-horror-in-bucha-russia-relies-on-propaganda-and-disinformation
25. Li E. Russia Likely behind Viral Twitter Thread Claiming Ukrainians Torturing Each Other; Thread Shared By Western Far Left and Far Right Figures. In: Global Influence Operations Report [Internet]. 2022 [cited 25 Mar 2022]. Available: https://www.global-influence-ops.com/russia-likely-behind-viral-twitter-thread-claiming-ukrainians-torturing-each-other-thread-shared-by-western-far-left-and-far-right-figures/
26. Abbruzzese J, Collins B. Russian disinformation, propaganda ramp up as conflict in Ukraine grows. In: NBC News [Internet]. 24 Feb 2022 [cited 10 Apr 2022]. Available: https://www.nbcnews.com/tech/internet/russian-disinformation-propaganda-ramp-conflict-ukraine-grows-rcna17521
27. conspirator0. A website where you can buy fake Twitter followers. In: Twitter [Internet]. 27 Mar 2022 [cited 9 Apr 2022]. Tweet text: “Oh look, it's a website where you can buy Twitter accounts with various numbers of followers. (Tip: doing so is almost certainly unwise.) cc: @ZellaQuixote”
28. NovelSci. A finding of interest was that accounts that boosted claims about Ukrainians being Nazis also denied the Bucha Massacre. This demonstrates the cost of failing to remove these accounts. They do not stop and do not alter behavior after temporary bans. They continue until banned.
Cover photo sourced from the case study subject’s tweet.
Pro-Kremlin is defined as narratives or claims promoted by, or that benefit, the Kremlin. Promotion may occur through, but is not limited to, overt propaganda and disinformation outlets, government officials, proxies, or online information operations [1–4].