June 14, 2023 | Digital Front Lines

Cyber Resilience Helps Democracies Prevail Against Authoritarian Disinformation Campaigns

The key is to mitigate attacks on communications systems and unmask attempts to corrupt the information environment.

Before Russian troops poured into Ukraine in February 2022, the Kremlin had already begun its war online, using cyber operations to disrupt Ukrainian citizens’ access to information and spread chaos and propaganda. This information warfare, however, has been met with the same resolve and resilience that Ukraine has shown on the battlefield, demonstrating for democracies around the world how to defeat authoritarian disinformation campaigns.

Information warfare often involves denying the adversary—domestic or foreign—access to information. During protests in Iran over the past four years, the regime has repeatedly throttled internet connectivity to try to prevent citizens who are organizing for greater freedoms from communicating with one another and the outside world. Russia has similarly tried—sometimes successfully—to use cyberattacks to disrupt Ukrainian military communications and citizens’ access to information. But Kyiv had planned for such attacks—by, for example, ensuring that alternative systems were in place—and has been able to neutralize the assaults and restart service.

Authoritarian regimes also use cyber-enabled influence operations to contaminate the information environment, pushing false narratives and hiding their own human rights abuses. During the Beijing Olympics, for example, pro-Chinese Communist Party Twitter accounts flooded social media in an attempt to hijack hashtags created by dissidents trying to draw attention to abuses in Xinjiang. Throughout its war with Ukraine, the Kremlin has tried to muddy the flow of information to Ukrainians and distort global perceptions of the conflict. Russia has doctored video evidence to deny war crimes, deployed operators on social media to create fake personas and news sites, and hacked user accounts to promulgate disinformation. Other Russian operations have tried to degrade confidence in the government in Kyiv: hackers compromised a live Ukrainian news broadcast, inserting a false breaking-news chyron claiming that Ukraine had surrendered. But Kyiv has been able to continue its YouTube broadcasts and social media posts to correct the record and reassure Ukrainians that their government still stands.


Countering Disinformation Requires Thwarting Digital Assaults

The Kremlin has failed to control the narrative in Ukraine because network defenders have kept communication infrastructure online and the government in Kyiv has repeatedly exposed Russia’s lies. But countering information operations requires more than keeping communication lines open; it also requires thwarting the adversary’s attempts to pollute them. Taiwanese civil society groups, for example, are in a pitched battle to counter Chinese Communist Party disinformation about their leaders, among other fake news. Among the ways the Taiwanese have fought back are teaching schoolchildren media literacy and creating news-verification tools.

This is what operational resilience against information warfare looks like: mitigating attacks on communications systems so the adversary does not gain an information monopoly, identifying the online infrastructure authoritarians use to promote false narratives, and unmasking attempts to corrupt the information environment. Future conflicts will see authoritarian states attempting to degrade access to information, control the narrative, and convince the public of the futility of the fight. But if the public can see that the enemy’s attacks are failing because democratic countries have hardened their infrastructure and are quickly detecting the adversary’s digital advances, not only will the enemy’s cyberattacks fail, but so will its disinformation campaigns.

Annie Fixler is the director of the Center on Cyber and Technology Innovation at the Foundation for Defense of Democracies and an FDD research fellow. She works on issues related to the national security implications of cyberattacks on economic targets, adversarial strategies and capabilities, and U.S. cyber resilience. She also contributes to the work of FDD’s Transformative Cyber Innovation Lab and Center on Economic and Financial Power. She is the co-editor of FDD’s four-part study on the cyber-enabled economic warfare strategies of America’s authoritarian adversaries. Her work has appeared in Defense One, The Hill, and The National Interest, among other publications. Follow Annie on Twitter @afixler. FDD is a Washington, D.C.-based, nonpartisan research institute focusing on national security and foreign policy.
