Australian researchers find dozens of channels using AI-generated audio and stock images to sway public opinion.
The Australian Strategic Policy Institute (ASPI) has published a report on its investigations into Chinese influence operations on YouTube, and the results are quite staggering.
Between October and November 2023, ASPI’s researchers found a network of at least 30 YouTube channels publishing content clearly designed to spread disinformation across a range of topics.
The channels shared common production techniques, such as AI-generated voiceovers and even AI-generated hosts (which ASPI had not seen used at this scale in previous campaigns), along with heavy use of stock and rendered images, and many of the videos were posted within a similar time window.
Moreover, quite a few of the videos featured the same or similar thumbnails and used the same lists of search keywords.
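These overlapping production fingerprints – repeated thumbnails in particular – are the kind of signal that can be checked programmatically. The sketch below is purely illustrative and not ASPI’s actual method: it uses perceptual hashing (via the ImageHash library) to flag near-duplicate thumbnails, and the folder layout, file extension and distance threshold are all assumptions.

```python
# Illustrative sketch only: flag near-duplicate thumbnails across channels.
# Folder layout, file extension and THRESHOLD are assumptions, not ASPI's method.
from itertools import combinations
from pathlib import Path

from PIL import Image
import imagehash  # pip install ImageHash

THRESHOLD = 8  # max Hamming distance to call two thumbnails near-duplicates (assumed)


def hash_thumbnails(folder: str) -> dict:
    """Compute a perceptual hash for every .jpg thumbnail in a folder."""
    return {p.name: imagehash.phash(Image.open(p)) for p in Path(folder).glob("*.jpg")}


def near_duplicates(hashes: dict):
    """Yield pairs of thumbnails whose hashes differ by at most THRESHOLD bits."""
    for (a, ha), (b, hb) in combinations(hashes.items(), 2):
        if ha - hb <= THRESHOLD:  # subtracting ImageHash objects gives Hamming distance
            yield a, b, ha - hb


if __name__ == "__main__":
    for a, b, dist in near_duplicates(hash_thumbnails("thumbnails/")):
        print(f"{a} ~ {b} (distance {dist})")
```

In practice, a low distance between thumbnails from nominally unrelated channels would simply be one more data point suggesting shared production, alongside the upload timing and keyword overlaps described above.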
Many of the videos promote a similar narrative – that China is beating the United States in a “technology war” – and feature Chinese companies such as Huawei. Some talk down the achievements of US companies such as Apple, while others promote the uptake of Chinese-backed infrastructure projects.
But what sets this network – which ASPI has dubbed Shadow Play – apart is its broad reach.
“The Shadow Play campaign involves a network of at least 30 YouTube channels that have produced more than 4,500 videos,” ASPI’s report said. “At time of publication, those channels have attracted just under 120 million views and 730,000 subscribers. The accounts began publishing content around mid-2022.”
“The campaign’s ability to amass and access such a large global audience – and its potential to covertly influence public opinion on these topics – should be cause for concern.”
One such campaign focused on the sale of Chinese-made buses to Nicaragua – hardly a world-changing topic, but one that nevertheless featured on four different YouTube channels: Current Insight, Global Visionary, China Secrets, and Insider Project. Thirty seconds into each video, the same English-language news page appeared, suggesting all four were slightly re-edited versions of the same video, altered just enough to evade spam detection.
Another set of channels all shared the same title sequence, while another set of videos from supposedly different channels all featured the same render of a bombed-out Palestinian building.
The channels also commonly referenced each other in their keyword selections.
“For example,” the report said, “China Hub, China Charged, Next-Gen Innovations, and Relaxian all mention the China Focus channel; Sinosphere, Curious Bay, East to West and Innovative Check mention Innovation Diary, and World Lens mentions Deepin Moments. We also noted that Asian Quicktake called itself Vision of China, another channel in the network, in its earlier videos”.
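These keyword cross-references are another signal that lends itself to simple automation. As a minimal sketch – with toy channel names and tags standing in for data that only ASPI holds – the snippet below maps which channels name other network channels in their video tags.

```python
# Illustrative sketch only: the channel names and tags below are toy
# placeholders, not data taken from the ASPI report.
from collections import defaultdict

# channel -> keywords/tags observed on its videos (hypothetical sample data)
channel_tags = {
    "China Hub": {"china focus", "technology war", "huawei"},
    "Sinosphere": {"innovation diary", "beidou", "5g"},
    "World Lens": {"deepin moments", "infrastructure"},
}

# channels already known to belong to the network (hypothetical)
network_channels = {"China Focus", "Innovation Diary", "Deepin Moments", "World Lens"}


def cross_references(tags_by_channel: dict, known_channels: set) -> dict:
    """Map each channel to the other network channels named in its tags."""
    lowered = {name.lower(): name for name in known_channels}
    mentions = defaultdict(set)
    for channel, tags in tags_by_channel.items():
        for tag in tags:
            hit = lowered.get(tag.lower())
            if hit and hit != channel:
                mentions[channel].add(hit)
    return dict(mentions)


print(cross_references(channel_tags, network_channels))
# e.g. {'China Hub': {'China Focus'}, 'Sinosphere': {'Innovation Diary'}, ...}
```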
First signs
In late October, ASPI first noted an X account, likely belonging to a real user, sharing content from the YouTube channel Deepin Moments. The video concerned China’s BeiDou satellite network – China’s equivalent of GPS – and its possible use by Iran. The tweet was posted hours after the BeiDou video went up and garnered more than 200,000 views, illustrating the reach of the Shadow Play network, especially given that the topic had not been broached by any other news service.
“We confirmed that these narratives hadn’t previously appeared on the internet by conducting another structured search that looked at content from the previous three years,” the report said. “We then conducted a suite of other structured searches across multiple online platforms.”
By tracking the proliferation of tweets mentioning BeiDou and the spread of the Deepin Moments video, ASPI uncovered hints of the wider Shadow Play network. Once a few channels had been identified, ASPI created a fresh YouTube account and seeded its recommendation algorithm with videos from those channels; this quickly surfaced more channels with similar production techniques and narratives.
By examining the metadata of these channels and their posts, and by conducting a “comprehensive discourse or narrative analysis”, ASPI’s researchers came to several conclusions.
First, the actor behind the videos is very likely a native Mandarin speaker: the English used in the videos often consists of direct translations of Chinese phrases. Second, the actor is familiar with the mainland Chinese news cycle, since many of the topics were based on articles first published in China.
Third, ASPI thinks it is “likely” the actor is either state-directed or state-supported. Much of the content supports China’s wider geopolitical goals of downplaying US influence and supporting its allies, such as Russia.
“Similar alignment can be seen in the effort to present China’s partners, such as Russia (and to a lesser extent Iran), as responsible, capable geopolitical actors,” the report said.
Once ASPI had made its findings, it contacted YouTube about the network. Many of the accounts were subsequently banned and their videos removed for terms-of-service violations relating to the spread of misinformation.
What comes next
Based on its reach alone, ASPI considers the Shadow Play network to be one of China’s most successful social media disinformation campaigns.
To counter this and future campaigns, ASPI recommends that social media platforms introduce transparency rules covering the use of generative artificial intelligence (AI) in content creation and investigate such campaigns using their own in-house “technical advantages”. Stock image providers and text-to-speech services should also monitor more closely who is using their content and how it is being used, and their terms of service should be updated to forbid the use of that content to spread disinformation.
ASPI also believes that the Five Eyes intelligence alliance should provide policymakers with detailed information on such operations, and that investigations into some similar operations should be declassified so they can be shared with partners in both the public and private sectors.
Finally, “law-enforcement agencies and foreign ministries should consider public attribution of information operations or threat actors, particularly those that target non-English-speaking audiences. This would also increase public awareness of what information operations look like, leading to improved societal resilience against such operations.”
Alarmingly, ASPI feels that this is merely the tip of the iceberg. The network itself is likely larger than the 30 channels it has identified, and non-English accounts are posting similar content in languages such as Bahasa Indonesia, a language related to Malay.