Earlier this year, The Strategy Bridge asked civilian and military students around the world to participate in our seventh annual student writing contest on the subject of strategy.
Now, we are pleased to present an essay selected for Honorable Mention from Sakeena Siddiqi, a recent graduate of the Eisenhower School.
China’s influence operations have evolved to employ “information laundering” to shape global narratives. Information laundering—the process of introducing disinformation into the Internet ecosystem and legitimizing it through transitions from fringe sites to public discourse—is the next generation of information operations. While Russia’s 2016 election interference is a well-documented and heavily litigated example of this process, China’s social media forays have been characterized as more overt and less skillful.[1] Its actions garnered minimal engagement, were easily attributed, and were ultimately blocked as state-sponsored content.[2] However, China’s recent history with influence operations demonstrates continually evolving tactics and narratives, with information laundering as the most recent and successful approach. “Borrowing a boat out to sea” is the English rendering of a Chinese term for the government’s approach of using foreign media to deliver its message.[3]
What It Is
Information laundering follows a three-step process.[4]
Placement constitutes the development of a counterfeit narrative.
Layering progresses the narrative through multiple platforms, using enablers, amplifiers (e.g., influencers), and accelerators (e.g., echo chambers) to help the narrative to go viral while hiding the original source.
Integration introduces this narrative into mainstream discourse via traditional and new media.
The process benefits the originator by obscuring their identity while influencing perceptions, gaining followers, and normalizing the positions of the originator.[5] The narrative washes through four types of enablers: “discovery (search engines), information (news and research), opinion (blogs and discussion forums), and expression (social networks, gaming, and online shopping).”[6] Borrowing their credibility results in the counterfeit narrative appearing factual.[7]
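The three-step model and the four enabler types can be sketched as a simple annotation scheme for tracking a narrative's progress. The stage labels and enabler taxonomy below follow the model as described above; the `Sighting` fields and the `classify` heuristic are hypothetical simplifications for illustration, not an operational detection tool.

```python
from dataclasses import dataclass
from enum import Enum

class Stage(Enum):
    """The three stages of the information laundering model."""
    PLACEMENT = "placement"      # counterfeit narrative is first seeded
    LAYERING = "layering"        # spread via enablers, amplifiers, accelerators
    INTEGRATION = "integration"  # narrative surfaces in mainstream discourse

# The four enabler types the narrative "washes" through.
ENABLER_TYPES = {
    "discovery": ["search engine"],
    "information": ["news site", "research portal"],
    "opinion": ["blog", "discussion forum"],
    "expression": ["social network", "gaming", "online shopping"],
}

@dataclass
class Sighting:
    """One observed appearance of a narrative (fields are illustrative)."""
    platform: str
    is_origin: bool    # earliest known appearance of the narrative
    mainstream: bool   # platform counts as traditional/mainstream media

def classify(sighting: Sighting) -> Stage:
    """Crude stage assignment: origin -> placement, mainstream ->
    integration, everything in between -> layering."""
    if sighting.is_origin:
        return Stage.PLACEMENT
    if sighting.mainstream:
        return Stage.INTEGRATION
    return Stage.LAYERING
```

The point of the sketch is only that each sighting of a narrative can be placed somewhere along the placement-to-integration pipeline, which is what makes the laundering metaphor analytically useful.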
Information laundering differs from cloaking, whereby a counterfeit narrative attempts to gain legitimacy by appearing as a reputable source, using techniques such as residing on a benignly named website, using a “.org” URL, or mimicking the appearance of a legitimate site.[8]
Pace University Associate Professor of Communication and Media Studies Adam Klein pioneered the concept of information laundering.[9] Samantha Korta, a researcher from the Naval Postgraduate School, expanded Klein’s model to encompass all counterfeit narratives, including disinformation (deliberately false information) and malinformation (true information deliberately contorted by context and presentation to be misleading).[10]
Humans and the Internet: Why Information Laundering Works
Characteristics of the internet, human behavior, and fake news foster information laundering’s success. First, the internet ecosystem enables fake news to spread across real networks. Information can now cross platforms and access millions—significantly more than traditional media sources like print newspapers or radio broadcasts.[11] Social media algorithms incentivize article proliferation, resulting in additional engagements without validating accuracy.[12] Additionally, mainstream culture is no longer limited to traditional news media and has expanded to include “other news, pseudo-news, and flat-out false news sites” through enablers.[13] Freedom House notes that “[t]he public use of smartphones to document events in real time turned ordinary internet users into citizen journalists.”[14]
Second, humans gravitate towards—and accept as fact—information they are predisposed to believe. Echo chambers, a natural outgrowth of social interaction, can thus be exploited “by cognitive factors such as confirmation bias, implicit egotism, and false consensus effect.”[15] Further, the third-person effect theorizes that people overestimate others’ susceptibility to fake news while underestimating their own, leading to a “my bubble is safer than yours” view.[16] While aware that fake news abounds, not only are people receptive to confirmatory data, they are also prone to reject disaffirming data regardless of veracity.
Third, counterfeit narratives lend themselves to proliferation, with one theory being that “false news is more novel and that more novel information is more likely to be retweeted.”[17] A Massachusetts Institute of Technology (MIT) study of tweets from 2006 to 2017 confirmed that false news, with politics as the largest rumor category, “diffused farther, faster, deeper, and more broadly than truth in all categories of information,” noting it was 70 percent more likely to be retweeted.[18] Notably, their findings were unchanged when removing bots from the analysis, indicating falsehoods spread via human “peer-to-peer diffusion.”[19] The study used multiple fact-checking organizations to label news as true or false. The difficulty in this assessment arguably also benefits counterfeit narratives as they cannot immediately be discounted.
China’s Use of Influence Operations
China’s history with influence operations in social media traces strategic pivots in narratives, audiences, and approaches. China’s 2011 Military Dictionary described public opinion warfare as “creating a favorable public opinion environment for political initiative and military victory,” through the “comprehensive use of various media means and information resources to fight the enemy.”[20] From China’s perspective, the extensive U.S. role in Internet infrastructure, service providers, and consumption, combined with the de facto dominance of the English language, constitutes a threat in public opinion warfare that justifies defensive actions.[21] Public opinion warfare continues in times of both peace and war, with peacetime operations aimed at “long-term infiltration into the objects of the society’s and culture’s deep structure, changing the awareness and conviction of the enemy masses.”[22]
Disinformation operations are a long-standing practice for China, with the People’s Liberation Army (PLA) only one of many agencies engaged in propaganda efforts.[23] PLA social media experts train in political warfare with goals of improving PLA image and correcting misperceptions.[24] China also operates two pillars of Chinese propaganda: the Central Propaganda Department and the United Front.[25]
The earliest pro-China influence operations using social media date to 2017, with subsequent propaganda networks uncovered by Graphika in 2019, 2020, and 2021, and by the Australian Strategic Policy Institute (ASPI) in 2020, Stanford Internet Observatory in 2020, and Bellingcat in 2020.[26] In June 2018, President Xi Jinping encouraged “telling China’s story well and molding a positive national image” while “actively using overseas social media platforms.”[27]
Graphika termed the networks “Spamouflage Dragon,” as spam-producing accounts also published political messages blaming then-presidential candidate Donald Trump for racial divisions.[28] Graphika could not, however, attribute the postings to the Chinese Communist Party (CCP), and the effectiveness of the campaign was stymied by lack of engagement and being blocked by social media companies.[29] Additionally, the audience was likely Chinese Communist Party officials, with kudos being earned by members for a high number of postings rather than for actual outside engagements.[30]
Marking a shift in audience, Buzzfeed News reported in March 2019 on the first indications of Chinese persona accounts or trolls on Western social media.[31] Actual attribution occurred in August 2019, when Twitter removed China’s state-sponsored tweets regarding the Hong Kong protests.[32] These tweets proved Chinese social media influence operations were aimed, for the first time, at Western audiences.[33] They also indicated China’s study of Russian and Iranian techniques: “using high-volume bot accounts, co-opting spam infrastructure (Twitter clients) to spread political messages, and amplifying controversial content.”[34]
More importantly, these also hinted at a narrative shift as China’s influence operations expanded from promoting the so-called China Story to tackling U.S. concerns in the first half of 2020.[35] In March 2020, China operated a combination of falsely generated accounts and repurposed accounts, indicating coordinated, systematic postings to alter international COVID messaging and deflect criticism.[36] Similarly, in June 2021, ProPublica and the New York Times reported on Uyghur content videos attempting to directly rebut and discredit a speech from Secretary of State Mike Pompeo critical of the Chinese Communist Party.[37] The report proved that China was generating counterfeit narrative content in direct response to Western reports.
In May 2021, Xi Jinping proposed that China “expand [the CCP’s] circle of friends in international public opinion,” pushing for use of influencers to amplify China’s story line.[38] Information laundering, as described in the following two case studies, is therefore the next evolution of the preceding efforts for embedding disinformation and malinformation. The studies contrast in the identification and attribution of counterfeit narratives, thereby revealing different ways the information laundering model applies in practice.
Uyghur Muslims in Xinjiang
China’s social media approach to the Uyghur Muslims in Xinjiang exemplifies the shifts in narrative and audience for disinformation. China initially conducted its Western social media engagements via Twitter, largely through state-sponsored media accounts.
The Chinese disinformation campaign began on 19 June 2020 with a reply and retweet referencing Xinjiang, with the retweets as a method to amplify and expand followers.[39] China has specifically pushed a counterfeit narrative to refute reports of genocide, imprisonment, and torture—atrocities dating back to 2017, as determined by then-Secretary of State Mike Pompeo.[40] Secretary Pompeo’s anti-CCP stance arguably drove this campaign, after his tweets overtook Chinese state-sponsored accounts for the most likes regarding Xinjiang.[41]
The information laundering model can be overlaid onto subsequent Xinjiang social media efforts. From 2020 to 2021, the use of domestic YouTubers, termed “frontier influencers,” softened messaging and added a facade of authenticity.[42] In contrast to earlier videos of Chinese officials echoing Chinese Communist Party propaganda, these vloggers were usually young women, typically Chinese minorities implicitly speaking for their entire ethnicity or region and showcasing their appearance and dancing skills.[43]
Generation of these videos stems from management agencies called multi-channel networks (MCNs), which produce videos and alter search algorithms through fresh content and frequent postings, while serving to hide their Chinese Communist Party connection.[44] In 2021, multi-channel networks accounted for about 40% of “top-performing accounts with more than 10 million followers on Chinese social media platforms.”[45] One organization, the Xinjiang Audio-Video Publishing House, is linked to the Chinese Communist Party’s United Front Work Department and funded Changyu Culture, whose content was in turn amplified on Twitter and YouTube via fake accounts.[46] A 2022 report from China’s National Radio and Television Administration (NRTA), a member of the CCP’s Central Propaganda Department, identified YouTube as “the main new media platform for the distribution of Chinese short videos overseas” with 169 million fans for the top 100 Chinese YouTube video contents, with the top three channels run by individual influencers.[47]
The effectiveness of these videos is difficult to discern, as some accounts created for the Changyu Culture YouTube channel merely subscribe to the channel rather than watch the videos—giving the appearance of high engagement.[48] Nonetheless, the integration into YouTube highlights several lessons learned. First, the frequent and fresh content rises to the top of search results with algorithms prioritizing new channels and postings, thus displacing older content.[49] Second, politicizing video titles, even when completely unrelated to the actual content, dilutes searches for credible news and increases video engagements.[50] Third, use of multi-channel networks allows YouTube to monetize the accounts and ensures payment for the creators without identifying them as propaganda, therefore side-stepping state-sponsored account labels.[51]
In line with Xi’s 2021 directive, China’s information laundering evolved to a new phase: the use of foreign influencers. Most notably, in what author Rush Doshi describes as the “information supply chain,” the latest influencers are Western and white: Canadian, British, South African, and American.[52] As of December 2021, The New York Times documented six of these influencers as having 130 million views and 1.1 million subscribers on YouTube.[53] While the influencers claim to have editorial control and deny being propaganda, these claims are dubious given amplification by Chinese officials and the absence of other foreign reporters in the region.[54] Further, several hallmarks of Chinese influence operations color these videos: use of algorithms and video shares by accounts with no digital background, payments via multi-channel networks, and a pro-China, travelogue-style approach.[55]
An April 2023 search of “Xinjiang” on YouTube demonstrates the efficacy of this approach. The first few suggested videos are from Jason of “Living in China” and Rafa of “Rafa Goes Around,” crowding out credible news on the ongoing genocide and imprisonment. The videos clearly aim at a Western audience, with both men speaking in English and employing upbeat tones, travelogue-style editing, and click-bait titles like “The Xinjiang THEY don’t want you to see…”[56]
Hunter Biden’s Laptop
A second case of malinformation aligns less neatly with the information laundering model and highlights the difficulty with attribution to the Chinese Communist Party, thus illustrating how the tactic can escape detection. Guo Wengui, an exiled Chinese billionaire, initially coupled real data recovered from Hunter Biden’s abandoned laptop with false statements that the material came from Chinese sources, depicted criminal sexual activity, and definitively proved the CCP’s control over President Joe Biden.
Guo ran a social media enterprise (Guo Media) originally established to promote anti-CCP narratives, but now noted for its disinformation campaigns on COVID, U.S. election-fraud, and Hunter Biden.[57] Guo had historic ties to Chinese intelligence, with his U.S. exile blurring his status as pro- or anti-CCP.[58] A federal court suit labeling Guo “a dissident hunter, a propagandist, and an agent in service of the People's Republic of China” ended without clear evidence of his position.[59]
Following initial placement of the malinformation on GTV, Guo Media employed layering techniques, using multiple networks such as Discord, GitHub, Google Drive, and both paid and voluntary personnel for translation and content production to amplify messaging through the use of real social media accounts (supporters referenced as “ants”).[60] After leaving the White House, Steve Bannon contracted with Guo’s media company for strategic consulting services to facilitate access for Guo Media to media personalities and to guide the company on industry standards.[61] Bannon and Guo’s interests aligned given the anti-CCP stance both held, which reinforced the active amplification step.[62]
The information laundering process can be traced through the story’s timeline, though its imperfections show in this case. Demonstrating all phases of the process, the New York Post published its article three weeks after Guo’s network had begun airing the material, following amplification by Bannon.[63] The Guo media networks, though, aimed at Chinese-speaking audiences with postings in Asia—along with Guo’s anti-CCP narrative—before the story was picked up by the Washington Examiner and the UK’s Daily Mail in a successful integration.[64] The anti-CCP claims did not figure in the more widespread Post story, thus eroding the initial counterfeit narrative and its intended impact. Additionally, Guo’s background adds a wildcard, complicating attribution; either he served as a Chinese intelligence operative, thus successfully hiding the Chinese Communist Party connection, or he operated as an independent entity.
Counter-Information Laundering Recommendations
Countering information laundering must address the primary factors facilitating it: human cognition and social media platform operations. Human cognition factors require a two-pronged approach: near-term incorporation of assessment speed bumps and long-term research on psychological and sociological features. In parallel, social media platforms require a nuanced approach, encouraging voluntary actions as a means to preserve reputational credit. Specific recommendations include the following.
Human Cognition
Fund research on the cognitive, social, and organizational aspects of information laundering, to increase understanding of its effectiveness and operation.[65]
Given that human diffusion accounts for the spread of false news, incorporate “accuracy nudges” on social media platforms.[67]
Add flagging mechanisms for suspect news.[68] Having users self-moderate content is imperfect and subjective, but it may alert social media platforms to coordinated activity and potential disinformation.
Social Media Platforms
Use credit fraud monitoring tactics to flag fraudulent activity like spree-posting, a hallmark of coordinated activity. Monitor, identify, and remove automated accounts, bots, spam, and AI-generated profiles and content.[66]
Increase transparency of monetized accounts. MCN-funded accounts may not warrant a “state-sponsored content” label, but they should still alert the viewer that the content is paid.
Publicize network analysis of viral hashtags and videos, so users can assess whether information originates from single sources, from coordinated activity, or from truly organic grassroots organization.
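The credit-fraud analogy can be made concrete with a velocity check, the same pattern card networks use to flag transaction sprees. This is a minimal sketch: the function name, input format, and the window and threshold values are assumptions for illustration, and a real platform would combine many more signals before acting on an account.

```python
from collections import defaultdict

def flag_spree_posters(posts, window_secs=600, max_posts=20):
    """Flag accounts whose posting rate within any sliding time window
    exceeds a threshold, a crude analogue of velocity checks in
    credit-card fraud monitoring.

    posts: iterable of (account_id, unix_timestamp) pairs.
    Returns the set of flagged account ids.
    """
    by_account = defaultdict(list)
    for account, ts in posts:
        by_account[account].append(ts)

    flagged = set()
    for account, times in by_account.items():
        times.sort()
        left = 0
        for right in range(len(times)):
            # shrink window until it spans at most window_secs
            while times[right] - times[left] > window_secs:
                left += 1
            if right - left + 1 > max_posts:
                flagged.add(account)
                break
    return flagged
```

An account posting dozens of times within minutes trips the check, while an ordinary user posting a few times a day does not; in practice the flag would feed a human review queue rather than trigger automatic removal.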
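The network-analysis recommendation can likewise be illustrated with a crude concentration metric over share data. The `origin_concentration` function and its input format are hypothetical; it only sketches the kind of single-source-versus-grassroots signal such a published analysis would surface.

```python
from collections import Counter

def origin_concentration(shares):
    """Estimate how concentrated a viral hashtag's spread is on a few
    original sources.

    shares: list of (sharer, original_poster) pairs, one per share.
    Returns the fraction of all shares tracing back to the single
    most-shared origin. Values near 1.0 suggest a single-source push;
    low values are more consistent with organic grassroots activity.
    """
    if not shares:
        return 0.0
    origins = Counter(orig for _, orig in shares)
    top_origin_count = origins.most_common(1)[0][1]
    return top_origin_count / len(shares)
```

Publishing a figure like this alongside trending content would let users see at a glance whether a viral hashtag grew from many independent posters or from coordinated amplification of one account.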
Conclusion
While Russia sees information space as a constant battleground, China’s operations hinge on “establishing the country as a global hegemon in the international order and on the key objective of maintaining positive global opinion.”[69] China has built an international media empire with state media bureaus, foreign media companies, and partnerships overseas in regions like Africa.[70] Chinese officials contracted for radio broadcasts, worked through diplomats (sometimes called “Wolf Warrior Diplomacy”), submitted op-eds, conducted interviews with U.S. outlets like NPR and Fox News, and paid for China Daily inserts in publications like the Washington Post and the New York Times.[71]
While China is creating and deploying disinformation and malinformation “with the end goal of instilling paranoia, one-dimensional critical thinking, and cognitive blindspots,” the social media campaigns mounted thus far appear to have little or no disruptive impact.[72] The lack of measurable success may therefore support a status quo approach of allowing counterfeit narratives to trickle through the internet.
For U.S. national security, information laundering presents a challenge for threat assessments. While Chinese literature recommends the tactic to conduct public opinion warfare, the current activities fall below the thresholds of both armed conflict and criminal activity. The trajectory of China’s influence operations proves information laundering to be a low-stakes operation with little risk of reprisal, but one that is likely to continue. Following the Xinjiang case study detailed above, for example, President Xi Jinping pushed in July 2022 for the “launch [of a] multi-level, omni-directional, three-dimensional propaganda about Xinjiang direction abroad [and to] perfect the work of ‘inviting in’…and tell China’s Xinjiang story well.”[73]
The evolution of these operations demonstrates increasing skill with a tactic of which the U.S. remains largely unaware and which it therefore ignores.[74] As China has proven a propensity to improve its abilities, narratives, and approaches, three emerging developments could raise information laundering from an annoyance to an actionable threat.
First, the advent of artificial intelligence (AI) to create user profiles, images, and videos could easily be exploited across the information laundering process.[75] Artificial intelligence could generate believable content for placement, further manipulating confusion between false and true news, and increase engagement through well-crafted fake accounts serving as amplifiers during the layering process—ultimately easing and accelerating integration.
Second, while China has taken advantage of freedoms afforded by Western social media, the increasing popularity of platforms like TikTok flattens the placement and layering process by bringing Western audiences directly to the content source.[76] Thus, blocking apps like TikTok may also slow the laundering process by requiring the Chinese Communist Party to continue to navigate Western platforms where the U.S. retains some measure of control. This also improves the ability of the U.S. to incentivize its platforms to control the spread of misinformation and disinformation through labeling, attribution, and bot removal.
Third, China appears to be expanding into joint campaigns with Russia. The Biden administration has identified disinformation campaigns regarding bioweapons labs in Ukraine, with China amplifying Kremlin propaganda.[77] Further, the campaign has integrated into U.S. mainstream media, following placement and amplification in QAnon networks.[78] The U.S. government must therefore continue to monitor and pre-bunk (i.e., anticipate, disprove, and discredit) upcoming counterfeit narratives in alignment with the Department of State’s Global Engagement Center (GEC) approach.[79]
Given that China’s information laundering falls within the bounds of competition, its activity qualifies as neither criminal conduct nor armed conflict. Thus, a proportionate response at this stage should focus on measures aimed at the factors which facilitate information laundering: human cognition and social media platforms. The U.S. is uniquely positioned to appropriate funds and engage the science and technology communities for research and development efforts. Further, the U.S. government can incentivize social media platforms through multiple methods. For example, legislatively offering tax incentives for costs associated with these recommendations encourages their implementation. Non-legislatively, government officials can drive greater engagement and traffic (which equates to advertising revenue) by patronizing those platforms which remove counterfeit narratives.
Sakeena S. Siddiqi is a Department of the Navy civilian with 17 years of acquisition experience spanning commercial and non-commercial acquisitions, major weapon systems, professional services, and policy. The views expressed in this article are those of the author and do not reflect the official policy or position of the Department of the Navy, the Department of Defense, or the U.S. Government.
Header Image: Social Media Logos in 3D, 2023 (Marila Shalabaieva).
Notes:
[1] Renée Diresta, Carly Miller, Vanessa Molter, John Pomfret, and Glenn Tiffert, Telling China's Story: The Chinese Communist Party's Campaign to Shape Global Narratives, (Stanford, CA: Stanford Digital Repository, 2020), 44, 47, https://doi.org/10.25740/pf306sw8941.
[2] Diresta et al, Telling China’s Story, 44. See also Craig Timberg and Shane Harris, "Chinese Network of Fake Accounts Targets Trump," The Washington Post, August 13, 2020, https://www.proquest.com/newspapers/chinese-network-fake-accounts-targets-trump/docview/2433067427/se-2.
[3] Nathan Beauchamp-Mustafaga and Michael S. Chase, Borrowing a Boat Out to Sea: The Chinese Military’s Use of Social Media for Influence Operations, (Washington, DC: Johns Hopkins Foreign Policy Institute, 2019), v, https://www.fpi.sais-jhu.edu/borrowing-a-boat-out-to-sea-pdf.
[4] Samantha Korta, “Fake News, Conspiracy Theories, and Lies: An Information Laundering Model for Homeland Security,” Homeland Security Affairs (March 2018): 80, https://www.proquest.com/scholarly-journals/fake-news-conspiracy-theories-lies-information/docview/2206253872/se-2.
[5] Ibid.
[6] Korta, “Fake News,” 84.
[7] Korta, “Fake News,” 99.
[8] Noah Arjomand, "Information Laundering and Globalized Media: Part I - The Problem," Center for International Media Assistance (blog), August 20, 2019, https://www.cima.ned.org/blog/information-laundering-and-globalized-media-part-i-the-problem/.
[9] Korta, “Fake News,” 77.
[10] Korta, “Fake News,” 81.
[11] Cindy Otis, “The Mainstreaming of Conspiracy Theories,” interview by Darragh Worland, Is That A Fact, News Literacy Project, audio transcript, https://newslit.org/podcast/the-mainstreaming-of-conspiracy-theories/.
[12] Sanja Kelly, Mai Truong, Adrian Shahbaz, Madeline Earp, Jessica White, Manipulating Social Media to Undermine Democracy (Washington, DC: Freedom House, 2017), 2, https://freedomhouse.org/report/freedom-net/2017/manipulating-social-media-undermine-democracy.
[13] Korta, “Fake News,” 88-89.
[14] Kelly et al, Manipulating Social Media to Undermine Democracy, 15.
[15] Korta, “Fake News,” 62, 73.
[16] Elena-Alexandra Dumitru, "Is ‘Letting the Truth Get in the Way of a Good Story’ enough? Journalists' Perception on Fake News," Journal of Media Research 14, no. 3 (November 2021): 76, https://www.proquest.com/docview/2615893167.
[17] Soroush Vosoughi, Deb Roy, and Sinan Aral, "The Spread of True and False News Online," Science 359, no. 6380 (March 2018): 1150, https://doi.org/10.1126/science.aap9559.
[18] Vosoughi et al, "The Spread of True and False News Online,” 1147, 1149.
[19] Vosoughi et al, "The Spread of True and False News Online,” 1150.
[20] Kenton Thibaut, Chinese Discourse Power: Ambitions and reality in the digital domain (Washington DC: Atlantic Council Digital Forensic Research Lab, 2022), 10, https://www.atlanticcouncil.org/in-depth-research-reports/report/chinese-discourse-power-ambitions-and-reality-in-the-digital-domain/.
[21] Beauchamp-Mustafaga and Chase, Borrowing a Boat, 4.
[22] Beauchamp-Mustafaga and Chase, Borrowing a Boat, 9.
[23] Beauchamp-Mustafaga and Chase, Borrowing a Boat, v.
[24] Thibaut, Chinese Discourse Power, 7.
[25] Diresta et al, Telling China's Story, 7.
[26] Ross Burley, Analysis of the Pro-China Propaganda Network Targeting International Narratives, (London, UK: Centre for Information Resilience, 2021), 4, 7, https://www.info-res.org/post/revealed-coordinated-attempt-to-push-pro-china-anti-western-narratives-on-social-media.
[27] Beauchamp-Mustafaga and Chase, Borrowing a Boat, 27.
[28] Timberg and Harris, "Chinese Network of Fake Accounts Targets Trump."
[29] Ibid.
[30] Diresta et al, Telling China's Story, 44.
[31] Beauchamp-Mustafaga and Chase, Borrowing a Boat, 94. See also Diresta et al, Telling China's Story, 15.
[32] Diresta et al, Telling China's Story, 20.
[33] Diresta et al, Telling China's Story, 25.
[34] Ibid.
[35] Fergus Ryan, Ariel Bogle, Albert Zhang and Dr Jacob Wallis, #StopXinjiang Rumors: The CCP’s decentralised disinformation campaign, (Barton, Australia: Australian Strategic Policy Institute, 2021), 6, https://www.aspi.org.au/report/stop-xinjiang-rumors.
[36] Jeff Kao and Mia Shuang Li, “How China Built a Twitter Propaganda Machine Then Let It Loose on Coronavirus,” ProPublica, March 26, 2020, https://www.propublica.org/article/how-china-built-a-twitter-propaganda-machine-then-let-it-loose-on-coronavirus.
[37] Jeff Kao, Raymond Zhong, Paul Mozur, and Aaron Krolik, “How China Spreads Its Propaganda Version of Life for Uyghurs,” ProPublica and New York Times, June 23, 2021, https://www.propublica.org/article/how-china-uses-youtube-and-twitter-to-spread-its-propaganda-version-of-life-for-uyghurs-in-xinjiang.
[38] Ryan et al, #StopXinjiang Rumors, 43.
[39] Ryan et al, #StopXinjiang Rumors, 30.
[40] Michael R. Pompeo, “Determination of the Secretary of State on Atrocities in Xinjiang,” Department of State, 2021, https://2017-2021.state.gov/determination-of-the-secretary-of-state-on-atrocities-in-xinjiang/index.html.
[41] Zhang et al, Strange Bedfellows, 6.
[42] Fergus Ryan, Daria Impiombato and Hsi-Ting Pai, Frontier influencers: The new face of China’s propaganda, Report No. 65/2022 (Barton, Australia: Australian Strategic Policy Institute, 2022), 6, 8, https://www.aspi.org.au/report/frontier-influencers.
[43] Ryan et al, Frontier Influencers, 25.
[44] Ryan et al, Frontier Influencers, 3-4.
[45] Ryan et al, Frontier Influencers, 36.
[46] Ryan et al, #StopXinjiang Rumors, 31.
[47] Ryan et al, Frontier Influencers, 11.
[48] Zhang et al, Strange Bedfellows, 15.
[49] Ryan et al, Frontier Influencers, 33.
[50] Ryan et al, Frontier Influencers, 31.
[51] Ryan et al, Frontier Influencers, 41. See also Kao et al, “How China Spreads Its Propaganda Version of Life for Uyghurs.”
[52] Thomas Brown, “How China Is Influencing YouTubers into Posting State Propaganda,” Medium, November 16, 2021, https://medium.com/swlh/how-china-is-influencing-youtubers-into-posting-state-propaganda-db72acf18dfa.
[53] Paul Mozur, Raymond Zhong, and Aaron Krolik, "YouTube Influencers are Tools in Beijing's Propaganda Blitz," New York Times, December 14, 2021, https://www.proquest.com/newspapers/youtube-influencers-are-tools-beijings-propaganda/docview/2609572913/se-2.
[54] Ibid.
[55] Ibid.
[56] Jason (@JasonLiving in China), “The Xinjiang THEY Don't Want YOU to see...,” YouTube, March 17, 2023.
[57] Graphika, Ants in a Web, (Graphika: 2021), 3-4, https://graphika.com/reports/ants-in-a-web. See also Jeanne Whalen, Craig Timberg and Eva Dou, “Chinese businessman with links to Steve Bannon is driving force for a sprawling disinformation network, researchers say,” Washington Post, May 17, 2021, https://www.washingtonpost.com/technology/2021/05/17/guo-wengui-disinformation-steve-bannon/.
[58] Dave Davies, "The Inscrutable Aims of Steve Bannon's Enigmatic Chinese Benefactor," NPR, October 20, 2022, https://www.npr.org/2022/10/20/1130184401/the-inscrutable-aims-of-steve-bannons-enigmatic-chinese-benefactor.
[59] Ibid.
[60] Graphika, Ants in a Web, 5-6.
[61] Jonathan Swan and Erica Pandey, "Steve Bannon's secret contract with a Chinese billionaire," Axios, October 29, 2019, https://www.axios.com/2019/10/29/steve-bannon-contract-chinese-billionaire-guo-media.
[62] Graphika, Ants in a Web, 22.
[63] Graphika, Ants in a Web, 24.
[64] Dan Friedman, "Exclusive: Leaked Messages Reveal the Origins of the Most Vile Hunter Biden Smear," Mother Jones, April 7, 2022, https://www.motherjones.com/politics/2022/04/hunter-biden-laptop-guo-wengui-bannon-giuliani/.
[65] Korta, “Fake News,” 118.
[66] Professor Brief to Eisenhower School Seminar on Operations in the Information Environment, “AI and Autonomy,” Stanford University, April 11, 2023.
[67] Nyla Husain, "On Social Media, Sharing Mindset Makes People Worse at Judging Accuracy," American Association for the Advancement of Science, August 6, 2021, https://www.aaas.org/news/social-media-sharing-mindset-makes-people-worse-judging-accuracy.
[68] Ibid.
[69] Diresta et al, Telling China's Story, 41.
[70] Lili Pike, “How China Uses Global Media to Spread Its Views—and Misinformation,” Grid News, May 18, 2022, 2.
[71] Diresta et al, Telling China's Story, 9. See also Sarah Cook, "Beijing's Global Megaphone: The Expansion of Chinese Communist Party Media Influence since 2017," (Washington, DC: Freedom House, 2020), 7, https://freedomhouse.org/report/special-report/2020/beijings-global-megaphone.
[72] Thibaut, Chinese Discourse Power, 7.
[73] Ryan et al, Frontier Influencers, 9.
[74] Ryan et al, Frontier Influencers, 3.
[75] Beauchamp-Mustafaga and Chase, Borrowing a Boat, 101.
[76] Beauchamp-Mustafaga and Chase, Borrowing a Boat, 104.
[77] Edward Wong, "U.S. Fights Bioweapons Disinformation Pushed by Russia and China," The New York Times, March 10, 2022, https://www.nytimes.com/2022/03/10/us/politics/russia-ukraine-china-bioweapons.html.
[78] Elise Thomas, “QANON goes to China – via Russia,” Institute for Strategic Dialogue, Last Modified March 23, 2022, https://www.isdglobal.org/digital_dispatches/qanon-goes-to-china-via-russia/.
[79] Department of State, Global Engagement Center, Brief to Eisenhower School Seminar on Operations in the Information Environment, March 17, 2023.