Russian Disinformation Campaign 'DoppelGänger' is Why Doppel Exists

The DoppelGänger campaign is a stark reminder of why Doppel exists – to match the speed of generative AI used in attacks to protect organizations.
Doppel Team
|
September 18, 2024

The Russian disinformation campaign known as "DoppelGänger" was unmasked earlier this year as a generative AI-fueled, clandestine operation targeting Western media and society. While the campaign was first exposed by the Brussels-based disinformation research non-profit EU DisinfoLab, the U.S. Justice Department on September 4th seized 32 web domains used by DoppelGänger, citing violations of U.S. money laundering and criminal trademark laws.

According to a Justice Department release, the U.S. Treasury Department, in conjunction with the domain seizures, announced the designation of 10 individuals and two entities as part of a coordinated response to Russia’s malign influence efforts targeting the 2024 U.S. presidential election.

The DoppelGänger campaign is a stark reminder of why Doppel exists – to match the sophistication and speed of generative AI used in attacks to protect elections, brands and people participating in the democratic process.

“Social engineering is the greatest threat vector with AI,” said Doppel CEO Kevin Tian. “We're on a mission to combat these multimodal AI attacks with good AI.”

The Anatomy of DoppelGänger

Operating under the guise of legitimacy, DoppelGänger mimicked real news outlets to promote pro-Russian narratives.

The DoppelGänger campaign began operating in May 2022, utilizing the resources of the two entities designated by the Treasury Department: Russian Social Design Agency and Structura National Technologies. These organizations worked in tandem to craft a network of false information, carefully camouflaged as legitimate news. By cloning media websites and using generative AI, DoppelGänger created articles and content that mirrored authentic news sources. This approach was so seamless that even experienced readers could mistake these fake articles for genuine journalism.

Among the prominent media outlets targeted by the campaign were Bild, The Guardian, 20 Minutes, ANSA, and RBC Ukraine. These cloned websites spread narratives favorable to Russia, distorting public perception of key international issues such as the war in Ukraine and sanctions imposed on Russia by the West.

Tools of Deception: AI, Fake Domains, and Social Media Impersonation

The strength of the DoppelGänger campaign lay not just in its content but in its tools and strategies. Generative AI played a pivotal role in creating articles that appeared indistinguishable from authentic news. The campaign also purchased domain names that closely resembled legitimate news outlets, making it easy to fool the unsuspecting reader.

For example, according to U.S. Cyber Command, a domain such as "nato[.]ws" was designed to mimic NATO’s official website. The site published press releases containing false claims, such as NATO’s purported plans to double its military budget and the alleged deployment of Ukrainian paramilitary troops to suppress protests in France. Links to the real NATO website were also embedded to lend credibility to the fake news articles.
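To make the lookalike-domain technique concrete, here is a minimal sketch, in Python, of how a defender might flag domains whose core label closely resembles a protected brand name. It is illustrative only and is not the Doppel platform's actual detection logic; the protected-brand list, similarity threshold, and helper names are assumptions made for this example.

```python
from difflib import SequenceMatcher

# Hypothetical list of brand/organization labels to protect -- an assumption
# for this example, not an actual Doppel configuration.
PROTECTED_BRANDS = ["nato", "theguardian", "bild", "20minutes", "ansa"]


def second_level_label(domain: str) -> str:
    """Return the label just left of the TLD (naive: assumes a single-label public suffix)."""
    parts = domain.lower().strip(".").split(".")
    return parts[-2] if len(parts) >= 2 else parts[0]


def similarity(a: str, b: str) -> float:
    """String similarity in [0, 1]; 1.0 means the strings are identical."""
    return SequenceMatcher(None, a, b).ratio()


def flag_lookalikes(domain: str, threshold: float = 0.8) -> list:
    """Return (brand, score) pairs for protected brands that the domain's core label resembles."""
    label = second_level_label(domain)
    scores = [(brand, similarity(label, brand)) for brand in PROTECTED_BRANDS]
    return [(brand, round(score, 2)) for brand, score in scores if score >= threshold]


if __name__ == "__main__":
    for candidate in ["nato.ws", "theguardlan.co", "example.org"]:
        print(candidate, "->", flag_lookalikes(candidate))
```

Even this simple comparison would flag a domain like nato[.]ws, whose second-level label exactly matches the brand it impersonates. In practice, digital risk protection systems draw on far richer signals, such as homoglyph and character-substitution checks, TLD swaps, certificate-transparency data, and newly registered domain feeds.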

U.S. Cyber Command also noted that another domain, “RRN[.]media,” was originally registered as "russianews[.]com" and was used to disseminate pro-Russian narratives. The content on this site, disguised as “fact-checked” news, falsely claimed that Western sanctions had no effect on Russia and exaggerated the negative impact of Ukrainian refugees in Europe.

The campaign didn't stop at cloned websites. DoppelGänger also employed social media bots to amplify its false narratives. These bots worked across platforms like Facebook and Twitter, sharing articles, polls, and videos designed to manipulate public opinion. Additionally, the campaign likely invested in sponsored posts to increase the visibility of its fake content, bypassing the filters and moderation efforts of social media companies.

Pro-Russian Narratives and Their Targets

DoppelGänger’s objectives were clear: to promote pro-Russian narratives and sow discord among Western populations. One of the campaign’s key narratives was to depict Ukraine negatively. Ukraine was falsely portrayed as a corrupt, Nazi-affiliated state, while the Russian government’s actions were framed as justified. Another example of disinformation was the denial of the Bucha massacre, an atrocity committed during the Russian invasion of Ukraine that received widespread condemnation.

Fearmongering also played a significant role in the campaign. Citizens in Germany, France, Italy, Latvia, the U.K., and the U.S. were targeted with messages designed to stir fear and resentment.

These false narratives were designed to blur the lines between fact and fiction. By seamlessly embedding disinformation into media platforms, DoppelGänger eroded the credibility of legitimate news outlets and stoked public distrust in democratic institutions.

The Need for Next-Generation Digital Risk Protection

As the DoppelGänger case demonstrates, disinformation knows no borders. It can easily penetrate media landscapes across the globe, affecting citizens, journalists, and policymakers alike. This global reach demands a coordinated, collective response from governments, media organizations, and tech companies.

Generative AI-fueled digital risk protection (DRP) platforms like Doppel shield leaders and brands from digital threats, allowing them to focus on their campaigns, organizations, and businesses with confidence.

Like the AI employed by attackers, the Doppel platform continuously learns, improving with each threat it neutralizes, ultimately taking a proactive approach to DRP. If DoppelGänger has proven anything, it’s that reactive defenses are no longer enough to combat this new breed of disinformation and social engineering campaigns.

Fighting Disinformation: What DoppelGänger Can Teach Us

The DoppelGänger campaign is a vivid illustration of the threats posed by modern disinformation campaigns. The use of generative AI, fake domains, and social media impersonation creates a complex web of falsehoods that can fool even the most discerning readers. As the campaign's fallout continues to unfold, it serves as a warning that the fight against disinformation is far from over.

Understanding how these campaigns work is the first step toward combating them. By staying vigilant and employing digital risk protection solutions, society can resist the corrosive effects of disinformation and safeguard democratic values. Ultimately, DoppelGänger’s legacy may serve as a catalyst for stronger defenses against the growing threats of digital impersonation and social engineering.

Ready to learn more?