
Russian Disinformation Campaigns In Last 10 Years: Investigation Of The ‘Doppelganger’ Evolution

By Ekalavya Hansaj
March 5, 2026

Why it matters:

  • The Russian disinformation campaign known as "Doppelganger" signals a significant shift in state-sponsored information warfare toward cloning Western media outlets.
  • This operation, identified in September 2022, involves techniques like typosquatting to create fake news sites resembling legitimate sources, enabling the dissemination of Kremlin-approved disinformation.

The Russian disinformation campaign known as the “Doppelganger” protocol represents a structural shift in state-sponsored information warfare, moving beyond crude bot farms to the systematic cloning of the Western media ecosystem. Identified by EU DisinfoLab in September 2022, the operation relies on a technique known as “typosquatting” to hijack the credibility of established news organizations. Russian operatives register domain names that bear a near-identical resemblance to legitimate outlets, such as bild.ltd instead of bild.de or washingtonpost.pm instead of washingtonpost.com, and populate them with sophisticated forgeries of real news sites. These clones replicate the design, bylines, and advertising slots of the target publications, creating a direct façade for disseminating Kremlin-approved disinformation.
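The TLD-swap pattern described above is mechanically simple to screen for. The sketch below is illustrative only; the outlet list and function name are invented for this example, not drawn from any real monitoring feed:

```python
# Map each known outlet's second-level label to its legitimate TLD.
# Illustrative subset only.
KNOWN_OUTLETS = {"bild": "de", "washingtonpost": "com", "lemonde": "fr"}

def is_tld_swap_clone(domain: str) -> bool:
    """Flag domains like bild.ltd: a familiar label paired with the wrong TLD."""
    label, _, tld = domain.rpartition(".")
    legit_tld = KNOWN_OUTLETS.get(label)
    return legit_tld is not None and tld != legit_tld

print(is_tld_swap_clone("bild.ltd"))           # True  (clone pattern)
print(is_tld_swap_clone("bild.de"))            # False (legitimate)
print(is_tld_swap_clone("washingtonpost.pm"))  # True
```

Real defenses would also handle subdomains and homoglyph substitutions, but even this crude check catches the TLD-swap variants documented in this campaign.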

The architecture of this campaign is not decentralized but strictly hierarchical. Investigations by Meta and the U.S. Department of Justice (DOJ) have attributed the operation to two specific Moscow-based firms: the Social Design Agency (SDA) and Structura National Technologies. These entities operate under the direct supervision of the Russian Presidential Administration, specifically reporting to Deputy Chief of Staff Sergei Kiriyenko. Unlike previous influence operations that relied on fringe blogs, Doppelganger parasitizes the authority of the mainstream press. By September 2024, the U.S. DOJ had seized 32 domains explicitly linked to this network, citing violations of money laundering and criminal trademark laws.

The operational mechanics reveal a high degree of technical coordination. The campaign employs “Keitaro,” a traffic distribution system that filters visitors based on their geolocation. If a user clicks a malicious link from a target country, they are served the disinformation content; if they access it from elsewhere, they are redirected to a neutral page to evade detection by security researchers. This geofencing capability allows the network to surgically target audiences in Germany, France, and the United States without triggering immediate global alarms. In January 2024, German authorities uncovered a network of over 50,000 fake user accounts on X (formerly Twitter) dedicated solely to amplifying these cloned links, generating massive engagement metrics to trick social media algorithms.
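The geofencing behavior described above reduces to a simple routing rule. This is a minimal sketch of the logic, not Keitaro's actual (proprietary) rule engine; the URLs and country list are invented:

```python
# Countries the campaign targets; visitors from elsewhere, and known
# crawlers, are shown a decoy page to frustrate researchers.
TARGET_COUNTRIES = {"DE", "FR", "US"}

def route_visitor(country_code: str, is_known_crawler: bool = False) -> str:
    """Return the URL a visitor is redirected to, based on geolocation."""
    if is_known_crawler or country_code not in TARGET_COUNTRIES:
        return "https://decoy.example/benign"     # neutral page / 404
    return "https://clone.example/fake-article"   # disinformation payload

print(route_visitor("DE"))                         # fake article
print(route_visitor("SE"))                         # decoy
print(route_visitor("FR", is_known_crawler=True))  # decoy
```

The practical consequence, as the text notes, is that a moderator reviewing the link from outside the target zone sees nothing objectionable.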

The content strategy has evolved from manual fabrication to automated generation. Early iterations involved human operatives translating Russian propaganda; 2024 analysis indicates a heavy reliance on generative AI to produce articles. These AI models draft stories that mimic the tone and style of the target publication, frequently mixing real news with fabricated quotes to blur the line between fact and fiction. The campaign does not spread “fake news” in the traditional sense; it injects specific narratives, such as the alleged futility of sanctions or the corruption of Ukrainian leadership, into the visual language of trusted journalism.

Verified Doppelganger Domain Clones (2022–2024)

Targeted Outlet | Legitimate Domain | Doppelganger Clone | Target Audience
Bild | bild.de | bild.ltd, bild.eu | Germany
The Washington Post | washingtonpost.com | washingtonpost.pm | United States
Le Monde | lemonde.fr | lemonde.ltd | France
The Guardian | theguardian.com | theguardian.co.com | United Kingdom
NATO | nato.int | nato.ws | International
Der Spiegel | spiegel.de | spiegel.ltd | Germany

The persistence of the Doppelganger network shows its strategic value to Moscow. Even after the September 2024 seizures, researchers at the Digital Forensic Research Lab (DFRLab) observed new domains appearing within 24 hours, frequently migrating content from the seized sites to new top-level domains like .cc or .pw. This rapid recidivism indicates a dedicated infrastructure designed for resilience, treating domain seizures as a calculated operating cost rather than a terminal disruption. The campaign’s focus has also widened; while initially targeting European support for Ukraine, late 2024 operations aggressively targeted the U.S. electorate, aiming to deepen political polarization ahead of the presidential election.

Financial records exposed in court filings reveal the scale of investment. The Social Design Agency budgeted millions of dollars for these operations, paying for social media ads to force-feed their cloned articles into the feeds of unsuspecting users. Unlike organic viral misinformation, Doppelganger is a “pay-to-play” system, purchasing visibility to ensure its fabricated narratives reach critical mass before fact-checkers can intervene. This commercialization of disinformation marks a dangerous evolution, transforming state propaganda into a scalable, industrial process.

Genesis 2022: The Initial Euro-Atlantic Target List Of Russian Disinformation Campaigns

The “Doppelganger” campaign did not emerge as a fully formed global operation but rather as a surgical strike against specific European vulnerabilities. Between May and September 2022, Russian operatives launched the initial phase of what would become the largest disinformation campaign of the decade. This period, defined by the registration of dozens of “typosquatted” domains, focused almost exclusively on the Euro-Atlantic axis, with a strategic emphasis on Germany, France, and Italy. The objective was immediate and tactical: to fracture the unified Western response to the invasion of Ukraine by fabricating dissent within its most influential member states.

Unlike previous “troll farm” operations that relied on generic blog posts or social media comments, this phase introduced high-fidelity cloning. Operatives from Structura National Technologies and the Social Design Agency (SDA), acting under the direction of the Russian Presidential Administration, created pixel-perfect replicas of trusted national newspapers. These clones were not parodies; they copied the CSS code, byline styles, and even the advertising slots of the original sites to deceive casual readers. The campaign’s infrastructure relied on “burner” Facebook accounts, single-use profiles that would post a link to a fake article and then be abandoned or banned, a tactic designed to overwhelm platform moderation systems through sheer volume.

The Primary Target Matrix (May–September 2022)

The initial target list reveals a clear geopolitical strategy: focus on the “Weimar Triangle” (France, Germany, Poland) and key NATO allies to erode public support for sanctions. The following table details the verified media outlets targeted during this genesis window, along with the specific fraudulent domains registered to host the content.

Target Country | Media Outlet Cloned | Fraudulent Domain Examples | Primary Narrative Focus
Germany | Bild, Der Spiegel, T-Online | bild.ltd, bild.xu, spiegel.ltd, t-online.pro | Energy prices, “economic suicide” via sanctions, refugee crime scares.
France | Le Monde, Le Parisien, 20 Minutes, Le Figaro | lemonde.ltd, 20min.neu, leparisien.ltd | Anti-Macron sentiment, futility of Ukraine support, fake government taxes for war.
Italy | ANSA (National News Agency) | ansa.ltd, ansa.pro | Sanctions hurting Italian businesses, energy shortages.
Ukraine | RBC Ukraine, UNIAN | rbc.ua.ltd, unian.org | Corruption in Kyiv, military incompetence, “surrender is inevitable.”
United Kingdom | The Guardian, Daily Mail | theguardian.co.com, dailymail.ltd | Cost-of-living emergency linked to Ukraine support.
NATO | Official Website | nato.ws | Fake press releases about deploying troops to suppress protests.

Narrative Warfare: Specific Fabrications

The content hosted on these domains went beyond vague criticism; it involved specific, falsified news events designed to trigger outrage. In France, a cloned Le Monde site published an article with the headline “French Minister supports the murder of Russian soldiers in Ukraine,” a complete fabrication intended to paint the French government as bloodthirsty escalators of the conflict. Similarly, a fake Ministry of Foreign Affairs page announced a non-existent 1.5% tax on “every monetary transaction” to finance military aid to Kyiv, a narrative tailored to ignite economic anxiety among the French working class.

In Germany, the campaign exploited fears of a cold winter. Fake Bild articles claimed that sanctions against Russia would lead to the total collapse of German industry and that Ukrainian refugees were responsible for a spike in violent crime. The nato.ws domain took this a step further, publishing a counterfeit press release claiming NATO planned to double its military budget and deploy Ukrainian paramilitary troops to France to “suppress” civilian protests, a narrative designed to frame the alliance as an occupying force within its own member states.

Operational Metrics and Ad Spend

Data from Meta and EU DisinfoLab provides a window into the scale of this initial push. In the second quarter of 2022 alone, Meta identified and removed a Russian network that spent approximately $12,000 on advertisements to promote these links. While the dollar amount appears modest, the volume of activity was substantial due to the low cost of automated “burner” accounts.

Verified traffic data for the 2022 period indicates a heavy skew toward Germany and France:

  • Germany: Approximately 2,250 distinct campaigns generating over 250,000 clicks.
  • France: Approximately 2,245 distinct campaigns generating over 249,000 clicks.
  • Ukraine: 1,339 campaigns generating nearly 149,000 clicks.
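A quick calculation on the figures above shows how uniform the per-campaign engagement was across all three countries, a pattern consistent with templated, automated campaigns rather than organic virality:

```python
# (campaigns, clicks) per country, from the 2022 traffic data above.
data = {
    "Germany": (2250, 250000),
    "France":  (2245, 249000),
    "Ukraine": (1339, 149000),
}

for country, (campaigns, clicks) in data.items():
    print(f"{country}: ~{clicks / campaigns:.0f} clicks per campaign")
# All three land at roughly 111 clicks per campaign.
```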

These numbers represent only the detected engagement. The campaign utilized sophisticated “geofencing” techniques to hide its tracks. If a user clicked a link from a targeted country (e.g., a French IP address clicking a Le Monde clone), they were taken to the fake article. If a researcher or moderator clicked from a non-target IP (e.g., the US), they were redirected to a benign page or a 404 error, complicating early detection efforts. This technical cloaking allowed the operation to run for months before the full scope of the “Doppelganger” network was publicly unmasked in September 2022.

Digital Forensics: DNS Spoofing and Typosquatting Metrics

The technical architecture of the Doppelganger campaign relies on a high-velocity “burn-and-churn” domain strategy designed to overwhelm Western countermeasures. Unlike traditional botnets that rely on compromised IoT devices, Doppelganger’s infrastructure is built on legitimate, albeit weaponized, commercial web services. Forensic analysis of the 32 domains seized by the U.S. Department of Justice in September 2024 reveals a sophisticated procurement network that prioritizes speed and obfuscation over longevity. The campaign’s operators, identified as the Social Design Agency (SDA) and Structura National Technologies, use a technique known as “homoglyphic typosquatting” to deceive users at the DNS level.

Between 2022 and 2025, the operation systematically exhausted specific Top-Level Domains (TLDs) that visually approximate legitimate media extensions. While standard typosquatting might rely on misspelling a brand name (e.g., washngtonpost.com), Doppelganger operatives instead manipulate the TLD itself. Forensic data shows a heavy reliance on .ltd, .pm, .wf, and .pics. For instance, the German outlet Bild was mimicked using bild.ltd and bild.pics, while the Washington Post was spoofed via washingtonpost.pm. These TLDs are frequently cheaper to register in bulk and attract less immediate scrutiny than .com or .org domains.

The resilience of this infrastructure is quantified by its “regeneration rate.” Following the September 4, 2024, DOJ seizures, forensic monitoring by the DFRLab confirmed that the network successfully launched 12 new mirror sites within 24 hours. These replacement domains immediately shifted to alternative TLDs including .cc, .co, .pw, and .so, demonstrating pre-planned redundancy. The table below outlines the forensic characteristics of key domains identified during this period.
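Given the observed TLD rotation, defenders can pre-compute a watchlist of likely replacement domains for a seized brand. A minimal sketch follows; it performs candidate generation only, and a real monitoring pipeline would then query WHOIS/DNS for each name:

```python
# TLDs the network migrated to after the September 2024 seizures.
FALLBACK_TLDS = ["cc", "co", "pw", "so"]

def seizure_watchlist(brand: str) -> list[str]:
    """Candidate replacement domains to monitor for a given brand."""
    return [f"{brand}.{tld}" for tld in FALLBACK_TLDS]

print(seizure_watchlist("bild"))
# ['bild.cc', 'bild.co', 'bild.pw', 'bild.so']
```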

Table 3.1: Doppelganger Domain Forensics & Seizure Data (2023–2025)
Targeted Outlet | Spoofed Domain | Registrar / Host | Technical Anomaly | Status (2025)
The Washington Post | washingtonpost.pm | Namecheap / Cloudflare | Mismatched TLD (.pm is Saint-Pierre and Miquelon) | Seized (DOJ)
Bild (Germany) | bild.ltd | NameSilo | Redirects via Keitaro TDS | Seized (DOJ)
Le Monde (France) | lemonde.ltd | Nameshield SAS | Geo-fenced to French IPs | Seized (DOJ)
Fox News | foxnews.wf | PDR Ltd. | Wallis and Futuna TLD usage | Inactive
RBC Ukraine | rbc.ua (cloned content) | Reg.ru | Hosted on Russian ASN | Active / Rotating

The backend infrastructure relies heavily on Traffic Distribution Systems (TDS), specifically the Keitaro tracker. This software, frequently used in affiliate marketing, allows the operatives to filter incoming traffic based on the user’s IP address and device fingerprint. If a request originates from a targeted country (e.g., Germany or France), the TDS serves the disinformation content. If the request comes from a known security researcher, a bot crawler, or an IP address outside the target zone, the system redirects to a benign page or returns a 404 error. This “cloaking” technique complicates automated detection and explains why many of these domains remain active for weeks before being flagged by platform safety teams.

Volume analysis from May 2024 indicates the campaign reached an industrial scale of production, publishing approximately one new fake article every 50 minutes. This output is supported by a network of “keystone” domains, such as warfareinsider.us and electionwatch.live, which serve as aggregators for the spoofed content. These sites do not mimic a specific brand but act as content hubs that are then amplified via social media botnets. The DOJ affidavit revealed that the network utilized at least 10 different registrars, with Namecheap accounting for 14 of the 32 seized domains, highlighting a strategy of vendor diversification to prevent a single point of failure.
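The reported cadence of one article every 50 minutes translates into a substantial standing output, as a quick calculation shows:

```python
minutes_per_article = 50
per_day = 24 * 60 / minutes_per_article   # 28.8 articles per day
per_month = per_day * 30                  # roughly 864 articles per month
print(per_day, round(per_month))
```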

Additionally, the use of “burn” domains for redirection has become a standard operating procedure. Instead of linking directly to the spoofed site on social media, the campaign uses disposable intermediate domains (frequently hosted on .xyz or .site) that instantly redirect the user to the final destination. This DNS obfuscation protects the primary spoofed domain from immediate blacklisting by social media platforms. By late 2024, investigators identified over 6,000 unique threat indicators associated with this single operation, confirming that Doppelganger has evolved from a nuisance into a persistent, hydra-headed technical threat.
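The disposable-redirect scheme can be pictured as a short hop chain from the social post to the spoofed site. The mapping below is entirely invented for illustration:

```python
# Invented example chain: social post -> burner redirect -> spoofed site.
CHAIN = {
    "short.example.xyz/a1": "burner.example.site/r",
    "burner.example.site/r": "clone.example.pm/article",
}

def resolve(url: str, max_hops: int = 5) -> str:
    """Follow the redirect chain until a terminal URL or the hop limit."""
    hops = 0
    while url in CHAIN and hops < max_hops:
        url = CHAIN[url]
        hops += 1
    return url

print(resolve("short.example.xyz/a1"))  # clone.example.pm/article
```

Because the social platform only ever sees the first link, blacklisting it burns nothing of value; the terminal spoofed domain survives.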

The Architects: Social Design Agency and Structura National Technologies

The operational core of the Doppelganger campaign is not a loose collective of hacktivists but a rigid corporate structure contracted directly by the Kremlin. Two Moscow-based firms, Social Design Agency (SDA) and Structura National Technologies, serve as the primary architects. These entities operate as “influence-for-hire” contractors, dividing the labor of disinformation into distinct ideological and technical disciplines under the supervision of Sergei Kiriyenko, the Deputy Chief of Staff of the Russian Presidential Executive Office.

Social Design Agency (SDA), founded by Ilya Gambashidze, functions as the campaign’s editorial and creative hub. Leaked internal documents reveal that SDA employs a staffed hierarchy of “ideologists,” “commentators,” and “bot farm operators” who work to strict production quotas. For a single project targeting France and Germany, SDA operatives were assigned monthly quotas of 60 cartoons, 180 memes, and 400 article comments. This industrial-scale content generation is designed to overwhelm organic discourse with fabricated narratives.

Structura National Technologies, led by CEO Nikolai Tupikin, provides the technical infrastructure that makes this content visible. Structura is responsible for the “typosquatting” method, registering domains that mimic legitimate news outlets, and managing the server networks that host these forgeries. Their technical role extends to the deployment of obfuscation scripts that redirect users from social media to these cloned sites, masking the traffic’s origin from platform moderators.

Project “Good Old USA”

In September 2024, the US Department of Justice unsealed an affidavit exposing a specific SDA initiative titled “Project Good Old USA.” This operation was explicitly designed to interfere in the 2024 US Presidential Election. Unlike general anti-Western propaganda, this project had precise, documented objectives: to reduce US support for Ukraine, promote pro-Russian policies, and exacerbate internal political divisions. The affidavit detailed how SDA operatives monitored the US information environment to identify “pain points,” such as inflation or border security, and weaponized them using the Doppelganger infrastructure.

Entity / Individual | Role | Sanction Date (US) | Sanction Date (EU)
Social Design Agency (SDA) | Content creation, strategy, “ideology” | March 20, 2024 | July 28, 2023
Structura National Technologies | Technical infrastructure, domain registration | March 20, 2024 | July 28, 2023
Ilya Gambashidze | Founder of SDA | March 20, 2024 | July 28, 2023
Nikolai Tupikin | CEO of Structura | March 20, 2024 | July 28, 2023

Financial records trace the flow of resources to these operations. Blockchain analysis identified approximately $200,000 in cryptocurrency transferred to wallets controlled by Gambashidze, a portion of which moved through Garantex, a sanctioned exchange. Additionally, Meta reported that these entities spent approximately $105,000 on Facebook and Instagram advertisements to amplify their content. While these sums appear modest compared to military budgets, they represent high-impact spending in the asymmetric domain of information warfare, where a single viral forgery can reach millions of voters.

The coordinated action against these firms culminated in late 2024. Following the US Treasury sanctions in March, the UK government followed suit on October 28, 2024, freezing assets and imposing travel bans. In September 2024, the US Department of Justice seized 32 domains used by SDA and Structura, severing a core portion of their distribution network. These seizures confirmed that the “Good Old USA” project was not a proposal but an active, funded operation targeting the American electorate.

Profile: Ilya Gambashidze and the Kremlin Contract


At the center of the Doppelganger operation stands Ilya Andreevich Gambashidze, a Moscow-based political technologist who has transitioned from domestic electioneering to orchestrating one of Russia’s most aggressive foreign influence campaigns. Born in Kyiv in 1977 and holding a PhD in sociology, Gambashidze founded the Social Design Agency (SDA), a firm that ostensibly offers media monitoring and consulting but functions as a primary contractor for the Kremlin’s information warfare. Unlike the chaotic troll farms of the mid-2010s, Gambashidze’s operation is professionalized, bureaucratic, and directly integrated into the Russian Presidential Administration’s command structure.

Gambashidze’s career trajectory aligns with the Kremlin’s shifting priorities. Formerly an assistant to the prefect of Moscow’s Northern District and an advisor to Pyotr Tolstoy, Deputy Chairman of the State Duma, he spent years managing regional elections and working with the Liberal Democratic Party of Russia (LDPR). By 2022, his focus had pivoted entirely to the West. Leaked internal documents from the SDA, obtained by European intelligence and media consortiums in 2024, reveal that Gambashidze views himself as a military commander in the information space. In internal video presentations, he has been recorded wearing a camouflage hoodie with a patch reading “Russian Ideological Troops” and “Commander of Special Forces,” explicitly framing his work as combat operations.

The Kiriyenko Link and State Contracts

The operational authority for Gambashidze’s campaigns flows from the highest levels of the Russian state. Intelligence disclosures from the U.S. Department of Justice and European agencies confirm that the SDA operates under the direct supervision of Sergei Kiriyenko, the Deputy Chief of Staff of the Presidential Administration. This relationship is not ideological but contractual. Between 2013 and 2020 alone, companies linked to Gambashidze received verified Russian government contracts worth approximately $2.7 million. Following the 2022 invasion of Ukraine, this funding stream expanded to support the Doppelganger infrastructure.

The “contract” entails specific deliverables monitored through rigorous Key Performance Indicators (KPIs). Leaked files show that the SDA is required to report on metrics such as the number of “cloned” articles published, engagement rates on social media, and the successful injection of specific narratives into Western discourse. The operation is run with corporate efficiency; internal records from early 2024 indicate that in a single four-month period (January to April), Gambashidze’s team produced exactly 39,899 distinct pieces of content, including:

SDA Content Production Output (Jan–Apr 2024)
Content Type | Quantity | Primary Platforms
Social Media Posts | 30,000+ | Facebook, X (Twitter), Telegram
Videos & Video Memes | 4,600+ | YouTube, TikTok, Instagram Reels
Memes & Infographics | 2,500+ | Viral distribution channels
Fake Articles/Longreads | 1,500+ | Typosquatted domains (e.g., bild.ltd)
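Spread across the four-month reporting window (121 days in leap-year 2024), the claimed 39,899 content units imply a relentless daily cadence:

```python
units = 39_899
days = 31 + 29 + 31 + 30      # January through April 2024 (leap year)
print(round(units / days))     # roughly 330 content units per day
```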

Sanctions and International Exposure

The exposure of Gambashidze’s operations led to his designation by multiple Western sanctions regimes. The Council of the European Union sanctioned him in July 2023, citing his role in manipulating information to support Russia’s war of aggression. The United States Department of the Treasury followed suit on March 20, 2024, designating Gambashidze and the SDA for engaging in a “foreign malign influence campaign.” The U.S. Treasury’s action specifically highlighted his use of cryptocurrency to obfuscate the funding trails for the fake domains, freezing wallets holding over $200,000 in assets.

Even with these measures, Gambashidze has maintained his operational tempo. His firm continues to adapt, utilizing “burner” accounts and constantly registering new domains to evade blocking. The leaked documents reveal a mindset of persistent warfare, where sanctions are viewed not as a deterrent but as validation of the “Ideological Troops’” effectiveness. His personal connections to the West remain a paradox; while his professional life is dedicated to undermining Western institutions, verified records indicate his ex-wife and two sons have resided in the United States, a fact that shows the compartmentalized nature of the modern Russian elite.

Follow the Money: Ad Spend Analysis on Meta Platforms

The “Doppelganger” campaign is not a guerrilla operation of organic trolls; it is a paid commercial enterprise. Verified data from Meta’s own adversarial threat reports and independent forensic analysis confirms that Russian operatives have injected at least $485,000 into Facebook and Instagram advertising systems between September 2022 and October 2024. This figure represents only the identified spend attached to specific clusters, suggesting the true financial footprint is likely significantly higher. Unlike the Internet Research Agency’s 2016 campaign, which relied heavily on viral organic reach, Doppelganger’s operators, specifically the Social Design Agency (SDA) and Structura National Technologies, treat disinformation as a pay-to-play marketing funnel.

Verified Doppelganger Ad Spend Events (2022–2024)
Period | Identified Entity | Est. Spend (USD) | Reach / Volume | Targeting Focus
Sep 2022 | Structura / SDA | $105,000+ | Initial Launch Cluster | Germany, France, Ukraine
Aug 2023 – Oct 2024 | Social Design Agency | $338,000 | 8,000+ Ads | Europe, USA, Israel
Q1 2024 | “Doppelganger” Cluster | $42,000 | 38 Million Accounts | France, Germany
Total Verified | | $485,000+ | 46M+ Impressions | Global West
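At the verified totals above, the cost per impression is trivially small, which is the economic core of the pay-to-play model:

```python
spend = 485_000            # USD, verified minimum spend
impressions = 46_000_000   # identified minimum impressions
cpi = spend / impressions
print(f"${cpi:.4f} per impression")  # roughly one cent each
```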

The payment infrastructure reveals a persistent ability to bypass Western sanctions. Even with SDA and Structura sanctioned by the European Union and the United States, these entities successfully purchased ads using US Dollars and Euros. They utilized a “burn-and-churn” method for ad accounts, deploying disposable Facebook Pages with generic or nonsensical names such as “Ytyqyq online shop,” “Reliable Recent News,” and “Tribunal Ukraine.” Once Meta’s automated systems flagged a page, the operatives simply discarded it and moved to the next pre-registered account, maintaining a constant stream of paid promotion. This “Whac-A-Mole” dynamic allowed the campaign to reach over 38 million accounts in France and Germany alone during a single seven-month window in late 2023 and early 2024.
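The burn-and-churn rotation described above amounts to a simple queue of pre-registered pages. A toy sketch follows; the class and its behavior are invented for illustration, with page names taken from the examples in the text:

```python
from collections import deque

class PageRotator:
    """Discard a flagged page and promote the next pre-registered one."""
    def __init__(self, pages):
        self.pool = deque(pages)
        self.active = self.pool.popleft()
        self.burned = []

    def flag_active(self):
        # Platform flags the active page: burn it, promote the next.
        self.burned.append(self.active)
        self.active = self.pool.popleft() if self.pool else None
        return self.active

r = PageRotator(["Ytyqyq online shop", "Reliable Recent News", "Tribunal Ukraine"])
r.flag_active()
print(r.active)   # Reliable Recent News
print(r.burned)   # ['Ytyqyq online shop']
```

The asymmetry is the point: registering a replacement page costs the operator minutes, while detection and takedown cost the platform far more.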

“The campaign deployed obfuscated URLs that redirected users to domains, sometimes geofenced, impersonating Ukrainian media outlets… The ads and pages in the campaign were removed from the platform within one day, yet the operation remains diffusive and at large.” (Digital Forensic Research Lab (DFRLab), March 2024)

To mask the destination of this traffic, the campaign employed sophisticated “cloaking” techniques. Forensic analysis identified a service known as Kehr, which managed at least 147 disposable domains like radilwanised.shop and shuanse.shop. These intermediate links acted as a filter: if a content moderator or bot clicked the link, they saw a benign page; if a targeted user clicked, they were redirected to a typosquatted disinformation site such as bild.ltd or washingtonpost.pm. This technical cloaking allowed the ads to survive Meta’s initial review processes, ensuring that the $485,000 spend delivered maximum disruption before detection.

The economics of this operation are clear. For less than half a million dollars, a rounding error in military budgets, Russian state actors purchased direct access to tens of millions of Western citizens. The cost-per-impression was negligible compared to the strategic value of eroding trust in democratic institutions. Moreover, the fact that Meta accepted payments from entities under strict international sanctions raises serious compliance questions. While the platform eventually removed thousands of assets, the revenue generated from these “high-risk” ads remained with the company, meaning Western tech platforms were paid to host the very infrastructure designed to undermine them.

The RRN Pivot: From Clones to Reliable Recent News

By mid-2022, the Doppelganger campaign executed a strategic pivot that fundamentally altered its operational DNA. While the initial phase relied on the brute-force cloning of established Western media brands, the second phase introduced a centralized content engine designed to mimic the aesthetics of independent journalism. This new architecture centered on a portal named “Reliable Recent News” (RRN), a site that served not merely as a repository for disinformation but as a credibility-laundering machine for the Kremlin’s most aggressive narratives.

The RRN pivot addressed a serious weakness in the earlier cloning strategy: the ephemeral nature of typosquatted domains. When a fake Bild or Le Monde domain was seized or blacklisted, its content vanished with it. RRN provided a persistent “home base” for these narratives, allowing Russian operatives to link back to a site that appeared, to the untrained eye, as a legitimate alternative news source. This hub-and-spoke model allowed the network to survive individual domain takedowns while maintaining a consistent stream of pro-Kremlin content.

Weaponizing the “Fact-Check” Format

The most insidious innovation of this phase was the weaponization of the “fact-check” itself. Recognizing that Western audiences had been trained to trust fact-checking organizations, Russian operatives launched “War on Fakes” (Voina s Feykami), a multi-platform initiative that adopted the visual language of verification (red stamps, “debunked” labels, and side-by-side photo comparisons) to spread disinformation. This was not a defensive measure; it was an offensive inversion of truth, where documented Russian war crimes were “fact-checked” into non-existence.

Data from the Atlantic Council’s Digital Forensic Research Lab (DFRLab) confirms the explosive growth of this tactic. The “War on Fakes” Telegram channel, launched on February 23, 2022, one day before the full-scale invasion, grew from 161 subscribers to over 625,000 within two weeks. By March 2022, its content was generating over 30 million daily views, a metric that rivals major legitimate news organizations.

Table 7.1: “War on Fakes” Telegram Growth & Reach (Feb–Mar 2022)
Source: DFRLab, Global Influence Operations Report
Date | Event | Subscriber Count | Daily Views (Est.)
Feb 23, 2022 | Channel Launch | 161 | <1,000
Feb 24, 2022 | Invasion Begins | ~5,000 | ~150,000
Mar 1, 2022 | English Site Launch | ~300,000 | 12.5 Million
Mar 9, 2022 | Peak Initial Growth | 625,000+ | 30.0 Million
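The table's endpoints imply an extraordinary compound growth rate over the launch fortnight:

```python
start, end, days = 161, 625_000, 14   # Feb 23 -> Mar 9, 2022
daily_factor = (end / start) ** (1 / days)
print(round(daily_factor, 2))          # ~1.8x per day, i.e. ~80% daily growth
```

Sustained ~80% daily growth over two weeks is far beyond typical organic channel growth, consistent with coordinated amplification.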

The content strategy for “War on Fakes” focused on immediate denial. When images of the Bucha massacre emerged, the channel released a “fact-check” claiming the bodies were actors, analyzing video artifacts to “prove” movement where there was none. These false debunks were then amplified by the RRN network and Russian diplomatic accounts, creating a feedback loop of fabrication that muddied the waters for casual news consumers.

Industrial-Scale Fabrication: The SDA Leak

The scale of the RRN operation was not a matter of guesswork. Leaked internal documents from the Social Design Agency (SDA), one of the Russian firms attributed to the campaign by the EU and Meta, reveal a highly bureaucratized production line for disinformation. The documents, verified by European intelligence and investigative consortiums, outline strict quotas for content creation, treating lies as a manufacturing output.

According to the leak, SDA operatives were assigned daily and monthly quotas that included the production of cartoons, memes, and comments. The agency claimed to have produced 39,899 “content units” in a single reporting period, a figure that demonstrates the industrial capacity of the operation. This was not a loose collection of hackers, but a structured corporate entity with KPIs (Key Performance Indicators) tied to the successful pollution of the Western information space.

“Cartoons: 60 units. Memes: 180 units. Article comments: 400.”
(Internal SDA quota document for a single project targeting France and Germany; source: VSquare/Leaked SDA Files, 2024)

The RRN portal itself, hosted at domains like rrn.world and rrn.media, acted as the polished front for this factory. It published articles in English, French, German, Spanish, Chinese, and Arabic, ensuring global reach. The site’s design mimicked the clean, minimalist aesthetic of modern digital news outlets, complete with “About Us” sections that vaguely promised “objective information” while concealing its links to Structura National Technologies and the Russian state.

The “Matriochka” Evolution

As platforms began to detect and block RRN domains, the campaign evolved into “Operation Matriochka,” a tactic named after Russian nesting dolls. This sub-campaign, identified by the French vigilance agency VIGINUM, involved directly challenging Western fact-checkers and journalists on platforms like X (formerly Twitter). Bots would reply to legitimate news posts with RRN links or “War on Fakes” content, asking journalists to “verify” the Russian disinformation, thus tricking them into engaging with the content and unwittingly boosting its algorithmic visibility.

This pivot to RRN and “fake fact-checking” represented a maturation of Russian information warfare. It moved beyond the simple deception of a cloned URL to a more complex psychological operation: eroding the very concept of objective truth by appropriating the tools designed to protect it.

Operation Overload: Flooding Fact-Checkers with Spam

The strategic logic of Russian information warfare shifted markedly in late 2023 with the activation of “Operation Overload.” While the Doppelganger protocol focused on cloning media assets to deceive the public, Operation Overload targeted the information ecosystem’s immune system: the fact-checkers, researchers, and journalists responsible for verifying truth. Identified by the Finnish research company CheckFirst in June 2024, this campaign represents a “denial of service” attack on human cognition, designed to exhaust the resources of verification organizations by flooding them with fabricated leads.

Between August 2023 and September 2024, operatives linked to the Kremlin sent approximately 71,000 emails to over 800 organizations across Europe, the United States, and Australia. The campaign operates on a simple yet destructive premise: inundate newsrooms with requests to verify fake content. These emails, often carrying subject lines such as “Please check” or “Look at this,” contain links to manipulated videos, AI-generated audio, or staged scenes of unrest. The goal is twofold: to force fact-checkers to waste hours debunking non-existent events, and to trick them into publishing “debunking” articles that inadvertently amplify the original Russian narrative to a wider audience.

The Mechanics of Resource Depletion

The operational architecture relies on a technique researchers call “content amalgamation.” Rather than pushing a single fake story, operatives create a layered, multi-platform ecosystem of fabrication. A typical attack involves the simultaneous release of a fake news article on a Doppelganger clone site, a staged video on Telegram, and a network of bot accounts on X (formerly Twitter) discussing the “event.” Operatives then email fact-checkers, posing as concerned citizens or anonymous sources, urging them to investigate the “viral” story.

Data from CheckFirst and the Institute for Strategic Dialogue (ISD) reveals the scale of this automation. In the lead-up to the 2024 Paris Olympics, the campaign intensified, targeting French and German outlets with narratives suggesting security failures or terrorism risks. One specific instance involved a fake documentary titled “Olympics Has Fallen,” narrated by an AI-generated voice resembling actor Tom Cruise. The video was distributed via Telegram and then flagged to journalists through thousands of spam emails, demanding verification. By compelling newsrooms to address these absurdities, the attackers successfully diverted editorial attention away from legitimate reporting on the war in Ukraine or Russian domestic affairs.

Table 8.1: Operation Overload Key Metrics (2023-2024)
Metric | Verified Data | Source
Total Emails Sent | 71,000+ | CheckFirst / Reset.Tech
Organizations Targeted | 800+ (media, fact-checkers, NGOs) | CheckFirst
Key Target Countries | France, Germany, Ukraine, USA, Australia | ISD / CheckFirst
Attributed Groups | Storm-1679, Storm-1099 (Doppelganger) | Microsoft Threat Intelligence
Primary Platforms | Email, Telegram, X, Bluesky | EU DisinfoLab

Tactical Evolution and Attribution

Technical evidence directly links Operation Overload to the broader Doppelganger infrastructure. Microsoft Threat Intelligence tracks the actors behind this specific spamming activity as “Storm-1679,” noting significant overlap with “Storm-1099,” the group responsible for the Doppelganger media clones. The campaigns share hosting providers, script patterns, and specific “fingerprints” in their video metadata. In September 2024, the operation pivoted toward the U.S. presidential election, with the volume of emails targeting American newsrooms tripling compared to the previous quarter.

The sophistication of the content has also escalated. Early iterations used crude photoshops; by 2024, attacks featured high-quality deepfakes and cloned voices of high-profile officials. In one documented case, operatives used AI to mimic the voice of an International Olympic Committee official in a fake recording discussing bribery. This content was then seeded into the inboxes of investigative journalists at major outlets like The Guardian, Der Spiegel, and The New York Times. The attackers anticipate that even if the content is identified as fake, the act of verification consumes valuable time. If a fact-checker ignores the email, they risk missing a “real” story; if they investigate, they fall into the trap.

“The actors aim to introduce their narratives to European audiences using [these] methods. They target fact-checkers and media organisations, prompting them to publish debunks or news stories about these narratives… [It is] a denial of service attack on the fact-checking community.” Guillaume Kuster, CEO of CheckFirst (June 2024)

The “Trojan Horse” of Debunking

A distinct danger of Operation Overload is its attempt to weaponize the debunking process itself. Traditional fact-checking relies on the assumption that correcting a falsehood reduces its harm. Russian strategists have inverted this logic. By creating content so outrageous that it demands a fact-check, they ensure that the narrative enters the mainstream media cycle. A debunking article titled “False: No Terrorist Attack Planned for Paris Olympics” still introduces the concepts of “Terrorist Attack” and “Paris Olympics” into the public consciousness, associating the event with fear. This “information laundering” allows fringe Telegram propaganda to leap into credible news feeds under the guise of correction.

The campaign also exhibits high adaptability. Following the Paris Olympics, the spam flood redirected its focus to the U.S. election, disseminating narratives about electoral fraud and civil unrest. The sheer volume of requests forces organizations to triage, inevitably causing legitimate threats to slip through the cracks while analysts wade through thousands of bad-faith “tips.” This systematic exhaustion of civil society’s defense mechanisms marks a permanent evolution in state-sponsored interference, moving from persuasion to paralysis.

The AI Multiplier: Automating Narrative Generation

The operational architecture of the Doppelganger campaign underwent a fundamental shift in late 2022. While earlier Russian influence operations relied on “troll farms” (buildings filled with human operators manually typing comments), Doppelganger introduced an industrialized, automated model. By integrating Large Language Models (LLMs) into their production pipeline, the Social Design Agency (SDA) and Structura National Technologies removed the human bottleneck from disinformation, allowing for the near-instantaneous generation of content across multiple languages.

This automation is not about speed; it is about volume and saturation. Forensic analysis by Recorded Future and the Microsoft Threat Analysis Center (MTAC) indicates that the campaign uses generative AI to produce thousands of unique articles, social media posts, and comments daily. These are not simple copy-paste jobs. The AI is tasked with rewriting core Kremlin narratives, such as the “ineffectiveness of sanctions” or “NATO aggression,” into hundreds of slight variations, each tailored to the linguistic and cultural nuances of target audiences in Germany, France, and the United States.

The “Hallucination” Feedback Loop

A serious and under-reported aspect of this phase is the weaponization of AI “hallucinations.” The campaign does not just use AI to write fake news; it exploits the way Western AI tools read the internet. By flooding the web with thousands of credible-looking fake news sites (e.g., Boston Times, ChicagoChron.com, San Fran Chron), Doppelganger creates a “data void” that is subsequently filled by its own fabrications.

When users query legitimate AI chatbots about specific topics, these models frequently cite the Doppelganger-created sites as authoritative sources. A June 2024 audit by NewsGuard found that leading AI chatbots repeated Russian disinformation narratives as fact in 32% of test cases, citing these fake local news outlets as their evidence. This creates a self-reinforcing loop: the campaign generates the lie, the fake sites host it, and trusted Western AI tools validate it for the end user.
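The feedback loop described above can be approximated with a small audit harness: given chatbot answers annotated with the source URLs they cite, count how many cite a known clone domain. This is a simplified, hypothetical sketch in the spirit of the NewsGuard test; the domain list and the canned sample answers are invented for illustration, and a real audit would query live chatbots rather than hard-coded responses.

```python
from urllib.parse import urlparse

# Known Doppelganger fronts named in the investigation; the exact domains
# used here are illustrative, not a verified blocklist.
FAKE_OUTLETS = {"chicagochron.com", "bostontimes.org", "rrn.media"}

def cited_domains(answer: dict) -> set:
    """Extract the hostnames a chatbot answer cites, without a 'www.' prefix."""
    return {urlparse(url).hostname.removeprefix("www.") for url in answer["sources"]}

def poisoned_share(answers: list) -> float:
    """Fraction of answers citing at least one known fake outlet."""
    hits = sum(1 for a in answers if cited_domains(a) & FAKE_OUTLETS)
    return hits / len(answers)

# Canned sample responses (invented for the example).
sample = [
    {"text": "...", "sources": ["https://www.chicagochron.com/story"]},
    {"text": "...", "sources": ["https://apnews.com/article/x"]},
    {"text": "...", "sources": ["https://rrn.media/en/post"]},
]
```

Running `poisoned_share(sample)` over these three canned answers yields 2/3, the same kind of headline figure (share of poisoned responses) that the NewsGuard audit reported.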

Table 9. 1: Verified AI-Amplified Disinformation Narratives (2023-2024)
Narrative Theme Origin Source (Fake) AI Amplification method Target Audience
Zelenskyy “High-End” Purchases “My Legacy” Superyacht, “Vuni Palace” Casino Chatbots citing fake real estate/marine logs US, UK, France
NATO Escalation nato. ws (Spoofed Domain) Summarization tools scraping fake press releases Germany, Italy
US Civil Unrest Election Watch, RRN. media AI-generated “news summaries” of non-existent riots United States
Paris Olympics Fear French-language fake local news Predictive text engines warning of specific terror plots France

Automated Comment Swarms

Beyond article generation, the campaign deploys AI to manage engagement. OpenAI’s May 2024 threat report confirmed the termination of accounts linked to Doppelganger that were used to generate comments on social media platforms like X (formerly Twitter) and 9GAG. Unlike previous bot networks that posted identical phrases, these AI-driven accounts analyze the context of a post and generate relevant, argumentative, or supportive replies in flawless local dialects.

This capability allows a single operator to manage a “swarm” of hundreds of distinct personas. In one documented instance involving the “Operation Matriochka” sub-campaign, AI tools were used to debug code for bots that automatically posted on Telegram, streamlining the technical maintenance of the network. The result is a “firehose of falsehood” that is cheaper, faster, and more resilient than any human-powered operation.

“The danger is not just that the AI writes the lie; it is that it creates the consensus. When a thousand distinct voices all say the same thing within minutes of an event, it breaks the social proof heuristics of the average reader.”

The “Fictitious Journalist” Persona

To lend credibility to these automated articles, the campaign uses Generative Adversarial Networks (GANs) to create profile photos for non-existent journalists. These faces, indistinguishable from real humans to the naked eye, are paired with AI-generated biographies and consistent posting histories across multiple platforms. This creates a “depth of identity” that withstands casual scrutiny. An investigation into the Election Watch website revealed that its entire editorial staff was composed of these AI constructs, publishing anti-Biden and anti-Ukraine content that was subsequently amplified by the very algorithms designed to surface “engaging” news.

The integration of these technologies signals a move from “informational noise” to “informational engineering.” The Doppelganger campaign does not rely on the chance that a story will go viral; it uses AI to mechanically guarantee that the story is indexed, amplified, and distributed before truth-based verification mechanisms can intervene.

Targeting the US: The 2024 Election Interference Vector

By mid-2024, the Doppelganger campaign had executed a strategic pivot, redirecting its primary kinetic capabilities from European targets to the United States presidential election. This shift was not opportunistic but a calculated directive from the Kremlin’s inner circle. Department of Justice (DOJ) filings from September 2024 expose a direct chain of command leading to Sergei Kiriyenko, Deputy Chief of Staff of the Presidential Executive Office. Under his supervision, the Social Design Agency (SDA) and Structura National Technology operationalized a sprawling network of typosquatted domains designed to intercept American voter traffic and inject Kremlin-manufactured narratives directly into the swing-state information ecosystem.

The operational mechanics relied on “cybersquatting” at an industrial scale. Unlike previous interference efforts that relied heavily on bot farms amplifying existing content, Doppelganger built an alternate-reality infrastructure. Russian operatives registered domains such as washingtonpost.pm and fox-news.in, creating pixel-perfect replicas of trusted American news outlets. These clones did not just copy design elements; they mirrored the exact CSS styling, font stacks, and ad placements of their legitimate counterparts. Once established, these sites hosted fabricated articles, frequently written by AI and polished by human editors, that alleged corruption within the Biden administration, exaggerated the migrant crisis at the southern border, or questioned the efficacy of U.S. aid to Ukraine.
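The typosquatting pattern lends itself to automated detection: a suspicious registration typically keeps the outlet’s registrable label almost intact while swapping the TLD. The sketch below is a minimal illustration using only the Python standard library; the similarity threshold and the list of monitored outlets are assumptions chosen for the example, not parameters from any production monitoring tool.

```python
from difflib import SequenceMatcher

# Outlets to protect (drawn from the seizure list discussed above).
LEGITIMATE = ["washingtonpost.com", "foxnews.com", "bild.de", "lemonde.fr"]

def strip_tld(domain: str) -> str:
    """Drop the TLD so 'washingtonpost.pm' and 'washingtonpost.com'
    compare on their registrable label alone."""
    return domain.rsplit(".", 1)[0]

def lookalike_score(candidate: str, legitimate: str) -> float:
    """Similarity of the two domain labels, ignoring the TLD (0.0-1.0)."""
    return SequenceMatcher(None, strip_tld(candidate), strip_tld(legitimate)).ratio()

def flag_typosquats(candidate: str, threshold: float = 0.8) -> list:
    """Return legitimate outlets a candidate domain appears to imitate.
    The 0.8 threshold is an assumption for this sketch."""
    return [
        legit for legit in LEGITIMATE
        if candidate != legit and lookalike_score(candidate, legit) >= threshold
    ]

print(flag_typosquats("washingtonpost.pm"))  # ['washingtonpost.com']
print(flag_typosquats("bild.ltd"))           # ['bild.de']
```

In practice, defenders run this kind of comparison against daily feeds of newly registered domains; fuzzier tricks (homoglyphs, inserted hyphens like fox-news.in) still score high because most of the label survives.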

Seized Domain (Sept. 2024) | Target / Mimicked Entity | Operational Function
washingtonpost.pm | The Washington Post | Hosted fake investigative reports attacking U.S. foreign policy.
fox-news.in | Fox News | Disseminated fabricated stories on border security and crime.
forward.pw | The Forward | Targeted Jewish-American audiences with divisive narratives.
50statesoflie.media | General US audience | Aggregator for anti-government conspiracy theories.
warfareinsider.us | Military/defense sector | Spread disinformation regarding U.S. military readiness and NATO.

The scale of content production revealed in leaked internal documents from the Social Design Agency is staggering. Between January and April 2024 alone, the SDA produced approximately 40,000 distinct pieces of content and generated over 33 million comments across social media platforms to drive traffic to these clones. This “Information Manipulation System” (IMS) functioned as a content mill, where quotas were set for memes, videos, and articles. The objective was not persuasion but saturation. By flooding X (formerly Twitter), Facebook, and comment sections with links to these “doppelganger” sites, the operation sought to create a “false consensus” effect, making fringe, pro-Russian viewpoints appear as dominant public sentiment.

Federal action in September 2024 resulted in the seizure of 32 domains, disrupting the immediate infrastructure. Yet the network demonstrated high resilience. Within 24 hours of the DOJ seizures, researchers at the Digital Forensic Research Lab (DFRLab) identified new domains coming online, utilizing different top-level domains (TLDs) like .co and .cc to replace the lost assets. This “whack-a-mole” dynamic indicates that the SDA and Structura operate with redundant infrastructure, treating domains as disposable ammunition rather than permanent assets. The persistence of these nodes proves that the 2024 vector was not a temporary skirmish but a sustained siege on the cognitive infrastructure of the American electorate.

The Israeli Front: Weaponizing the Gaza Conflict

Genesis 2022: The Initial Euro-Atlantic Target List

Following the Hamas attacks on October 7, 2023, the “Doppelganger” network executed a strategic pivot, opening a dedicated front targeting Israeli public opinion and the broader Western perception of the war in Gaza. Unlike previous operations that focused almost exclusively on undermining support for Ukraine, this evolution weaponized the trauma of the Israel-Hamas war to fracture Western alliances. Russian operatives deployed their signature “typosquatting” infrastructure to clone leading Israeli media outlets, injecting fabricated reports designed to sever the diplomatic artery between Tel Aviv, Kyiv, and Washington.

Between November 2023 and March 2024, the network registered dozens of domains mimicking trusted Israeli news sources. Verified targets included Walla, The Liberal, and Mako. These clones hosted articles written in fluent Hebrew that deviated sharply from the editorial stance of the original publications. For instance, a forged Walla article carried the headline “Shifa Hospital as another defeat in the information war,” while other fabricated pieces argued that the United States was “sacrificing Israel’s security” to maintain its proxy war in Ukraine. The operational goal was twofold: to convince Israelis that Ukraine was a liability and to persuade Western audiences that military aid destined for Kyiv was being diverted to Hamas terrorists.

Cross-Platform Amplification and the “Stars of David” Operation

The campaign’s sophistication was most visible in its integration of digital disinformation with physical acts of vandalism. In November 2023, French authorities identified a coordinated network of over 1,000 bots affiliated with the Doppelganger infrastructure (specifically the RRN/Recent Reliable News cluster) amplifying images of Stars of David stenciled on buildings in Paris. While the physical graffiti was allegedly commissioned by Russian-linked actors, the digital network instantly weaponized the imagery to fuel narratives of surging antisemitism in Europe, blaming Muslim immigrants and weak Western leadership.

This “hybrid” tactic allowed the network to bypass content moderation filters. Instead of generating purely synthetic text, the bot farm amplified “real” news of the vandalism, which they had secretly orchestrated, to validate their divisive narratives. Data from the French agency VIGINUM confirmed that the bot network began circulating photos of the graffiti nearly 48 hours before they gained organic traction on social media, indicating premeditated coordination.
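VIGINUM’s timing evidence suggests a simple detection heuristic: if accounts already flagged as bots begin pushing an image long before any organic account mentions it, the “viral” story was pre-seeded rather than discovered. A minimal sketch of that check follows; the timestamps are invented, and the 24-hour threshold is an assumption for the example, not a figure from the VIGINUM report.

```python
from datetime import datetime, timedelta

def bot_lead_time(posts) -> float:
    """Hours by which the earliest flagged-bot post precedes the earliest
    organic post. `posts` is a list of (timestamp, is_bot) pairs."""
    first_bot = min(t for t, is_bot in posts if is_bot)
    first_organic = min(t for t, is_bot in posts if not is_bot)
    return (first_organic - first_bot) / timedelta(hours=1)

def looks_preseeded(posts, threshold_hours: float = 24) -> bool:
    """Flag a story whose bot amplification predates organic pickup."""
    return bot_lead_time(posts) >= threshold_hours

# Invented timeline mirroring the ~48-hour lead VIGINUM described.
posts = [
    (datetime(2023, 10, 27, 6, 0), True),    # first bot post
    (datetime(2023, 10, 27, 9, 30), True),   # more bot amplification
    (datetime(2023, 10, 29, 5, 0), False),   # first organic pickup, ~47h later
]
```

With this sample timeline, `bot_lead_time(posts)` is 47 hours, so the story is flagged as pre-seeded. Real analyses must of course also solve the harder problem of labeling which accounts are bots.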

Fabricated Narratives and Targeted Outlets

The Doppelganger network utilized a specific matrix of targeted outlets to disseminate tailored disinformation themes. The following table details verified instances of cloned media brands and the specific false narratives injected into the information ecosystem between late 2023 and 2024.

Table 11.1: Verified Doppelganger Clones and Disinformation Themes (2023-2025)
Targeted Outlet (Clone) | Target Audience | Fabricated Narrative / Headline | Strategic Objective
Walla (Israel) | Hebrew speakers | “Shifa Hospital as another defeat in the information war” | Undermine trust in IDF spokesmanship; amplify internal defeatism.
The Washington Post (USA) | US / international | “Weapons supplies from Ukraine to Hamas have tripled over the past month” | Fabricate a link between Ukrainian corruption and Hamas armaments to cut aid to Kyiv.
Le Parisien (France) | French speakers | False reports of an AIDS epidemic in the Ukrainian Armed Forces | Dehumanize Ukrainian soldiers; link support for Ukraine to public health risks.
Mako (Israel) | Hebrew speakers | Articles claiming US aid to Israel is contingent on abandoning Ukraine | Drive a wedge between Israel and the US; promote Russian diplomacy.
Bild (Germany) | German speakers | “The Ukraine conflict does not strengthen NATO, rather exposes problems” | Weaken German resolve for NATO; frame the alliance as obsolete amid the Gaza crisis.

Evolution of Tactics: 2024-2025

By mid-2024, the campaign’s focus within Israel shifted from general geopolitical disorientation to direct political interference. Analysis of the network’s activity revealed a pivot toward attacking the Israeli government directly. In late 2024, fake accounts began amplifying narratives accusing Prime Minister Benjamin Netanyahu of “betraying the hostages” and prolonging the war for political survival. This shift mirrored the network’s behavior in the United States and Germany, where it opportunistically amplified extreme voices on both sides of the political spectrum to maximize social friction.

The infrastructure supporting these operations remained resilient despite takedowns. In September 2024, the U.S. Department of Justice seized 32 domains used by the network, including washingtonpost.pm and bild.ltd. Yet the network rapidly migrated to new top-level domains (TLDs) and began utilizing redirect chains through legitimate compromised websites to mask the final destination of their propaganda. By 2025, the operation had integrated AI-generated audio and video deepfakes into its arsenal, circulating clips that falsely depicted Israeli officials disparaging American support, further complicating verification for news consumers.

The French Connection: Anti-Macron Campaigns and Mirage

France has become the primary target of Russian information warfare in Western Europe, a shift driven by President Emmanuel Macron’s increasingly hawkish stance on military aid to Ukraine. The “Doppelganger” apparatus, while global in scope, has deployed specific sub-campaigns to fracture French social cohesion and discredit the Élysée Palace. This offensive is not a collection of fake news stories but a synchronized “ecosystem of influence” identified by the French vigilance agency VIGINUM. The campaign, frequently codenamed “Portal Kombat” by French intelligence, uses a network of at least 193 “information portals” to saturate the French-speaking web with Kremlin-aligned narratives.

The operational architecture relies on a “mirage” strategy, creating an illusion of domestic unrest and policy failure through the mass duplication of legitimate media. Russian operatives have registered typosquatted domains such as leparisien.ltd, lemonde.ltd, and lefigaro.ltd to host fabricated articles. These forgeries are not crude parodies; they replicate the exact CSS styling, font families, and advertising scripts of the target publications. A forensic analysis of the pravda-fr.com node revealed that these sites do not produce original content; instead, they automate the scraping and rewriting of pro-Russian Telegram channels, translating the output into French to polarize public debate.

Targeted Domains and “Portal Kombat” Infrastructure

The “Portal Kombat” network, exposed in February 2024, represents a sophisticated evolution of the Doppelganger technique. Unlike earlier iterations that relied on manual dissemination, this network uses automated “content farms” to publish thousands of articles daily. The infrastructure is linked to the Crimean-based web development company TigerWeb, which provides the technical backbone for these influence operations. The table details the specific French media outlets impersonated and the deceptive domains used to host the forged content.

Table 12.1: Verified Doppelganger Domains Targeting French Media (2023-2024)
Targeted Outlet | Legitimate Domain | Fake Doppelganger Domain | Primary Disinformation Narrative
Le Monde | lemonde.fr | lemonde.ltd, lemonde.fr-ltd.news | Claims of French economic collapse due to sanctions.
Le Parisien | leparisien.fr | leparisien.ltd, leparisien.fr-ltd.news | Fabricated reports of mass mobilization/conscription.
Le Figaro | lefigaro.fr | lefigaro.ltd, lefigaro.re | Attacks on French military aid and “Mirage” jet ineffectiveness.
20 Minutes | 20minutes.fr | 20minutes.ltd | Fake polls showing 80% opposition to Ukraine support.
French Ministry of Foreign Affairs | diplomatie.gouv.fr | diplomatie.gouv.fr-ltd.news | False travel advisories and “mercenary” death lists.

The “Mirage” Narrative: Discrediting Military Aid

A central pillar of the 2024 anti-Macron offensive focuses on the delivery of Mirage 2000-5 fighter jets to Ukraine. This specific narrative vector, which analysts have termed the “Mirage Disinformation Complex,” seeks to undermine the strategic value of French military hardware. Russian state media and their Doppelganger clones circulated fabricated reports claiming the Mirage jets were “obsolete flying coffins” and that Ukrainian pilots were refusing to fly them. In early 2024, a coordinated wave of fake articles appeared on cloned sites alleging that France was deploying secret squadrons of “mercenaries” to operate these aircraft, a claim directly aimed at provoking Russian domestic support for escalation.

The “Matryoshka” campaign, a parallel operation identified by VIGINUM, complements these narratives by directly harassing French fact-checkers and media organizations. Operatives posing as concerned citizens flood the inboxes of journalists with requests to “verify” fake news about the Mirage jets or President Macron’s personal life. This “disinformation-as-a-service” model aims to exhaust the resources of verification teams, creating a “mirage” of grassroots concern where none exists. The campaign has also targeted the personal reputation of the President, amplifying baseless conspiracy theories regarding First Lady Brigitte Macron in an effort to degrade the dignity of the presidential office.

“The objective is not to deceive but to exhaust. By forcing French institutions to constantly debunk absurdities, from ‘mercenary’ deaths in Kharkiv to ‘Mirage’ ineffectiveness, the Kremlin forces a defensive posture that consumes time, money, and political capital.” (VIGINUM Strategic Analysis Report, 2024)

Data from the 2024 European elections period indicates a sharp spike in these activities. Between January and June 2024, the volume of anti-French disinformation on X (formerly Twitter) increased by 40%, with a significant concentration of posts originating from accounts previously dormant or repurposed from commercial bot networks. The “Mirage” narratives were timed to coincide with official announcements of military aid, attempting to preemptively neutralize the public relations benefit of French support for Kyiv.

Server-Side Shadows: Hosting Infrastructure and IP Attribution

The “Doppelganger” campaign distinguishes itself not by the volume of its content but by the resilience of its backend architecture. Unlike earlier Russian information operations that relied heavily on ephemeral social media accounts, Doppelganger is anchored in a complex, multi-layered hosting infrastructure designed to withstand takedowns and attribution attempts. Investigations conducted between 2022 and 2025 reveal a network that hybridizes “bulletproof” hosting services with legitimate European infrastructure, creating a server-side shadow game that complicates mitigation efforts.

At the core of this infrastructure lies the use of Traffic Distribution Systems (TDS) and cloaking software, specifically Keitaro and Kehr. These tools, frequently marketed for affiliate marketing and arbitrage, allow operators to filter incoming traffic based on the user’s IP address, device type, and location. When a user clicks a Doppelganger link on social media, they are not immediately taken to the disinformation site. Instead, they pass through a “redirection maze”: a three-stage process designed to screen out researchers and automated crawlers.

Technical analysis by Qurium and EU DisinfoLab identified the following redirection flow:

Stage 1: The Front Domain. A disposable URL posted on social media (e.g., news-update.com) that hosts innocent-looking content or a blank page to evade platform scanners.
Stage 2: The Filter. A server running Keitaro or Kehr software analyzes the visitor. If the visitor is identified as a bot, a platform moderator, or a user from a non-target country, they are redirected to a benign site. If the visitor is a target (e.g., a German user on a mobile device), they proceed.
Stage 3: The Payload. The user is redirected to the typosquatted disinformation site (e.g., bild.ltd), which serves the fabricated news article.
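The Stage 2 decision logic can be made concrete with a short sketch. This is a hypothetical reconstruction of what a Keitaro-class filter does, not actual Keitaro configuration; the bot markers, target countries, and decoy/payload URLs are illustrative assumptions.

```python
# Illustrative Stage 2 "filter" logic; all constants are assumptions.
BOT_MARKERS = ("bot", "crawler", "spider", "facebookexternalhit")
TARGET_COUNTRIES = {"DE", "FR"}                # example target audience

DECOY_URL = "https://example-benign.site/"     # shown to researchers/crawlers
PAYLOAD_URL = "https://bild.ltd/article"       # typosquatted payload site

def route_visitor(user_agent: str, country: str, is_mobile: bool) -> str:
    """Return the redirect destination for one visitor (Stage 2)."""
    ua = user_agent.lower()
    if any(marker in ua for marker in BOT_MARKERS):
        return DECOY_URL          # screen out automated crawlers/moderation bots
    if country not in TARGET_COUNTRIES:
        return DECOY_URL          # screen out non-target geographies
    if not is_mobile:
        return DECOY_URL          # example rule: this campaign targets mobile users
    return PAYLOAD_URL            # hand off to Stage 3: the clone site
```

The asymmetry is the point: a researcher fetching the link from a datacenter IP with a crawler user-agent only ever sees the benign decoy, which is what makes platform-side evidence collection so difficult.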

This architecture relies heavily on specific hosting providers known for their leniency towards abuse complaints. Two entities have emerged as central pillars of this operation: Aeza Group and Stark Industries Solutions.

Aeza Group, a Russian provider with a significant European footprint, was identified as a primary hub for Doppelganger’s server operations. Despite presenting itself as a legitimate business, Aeza’s network (including AS210352) was found to host not only disinformation assets but also infrastructure for ransomware groups and darknet markets like BlackSprut. In July 2025, the U.S. Treasury sanctioned Aeza Group and its founders, Yuri Bozoyan and Arseny Penzev, for their role in facilitating these operations. The sanctions highlighted how Aeza operated “bulletproof” services, ignoring takedown requests and allowing Russian state actors to use servers physically located in Frankfurt and Amsterdam.

Similarly, Stark Industries Solutions (later rebranded to the.hosting and linked to WorkTitans BV) provided significant infrastructure resilience. Established just weeks before the 2022 invasion of Ukraine, Stark Industries became a go-to provider for DDoS attacks and disinformation hosting. By May 2025, the European Union had sanctioned the company and its owners, the Neculiti brothers, citing their provision of servers that enabled the Doppelganger campaign to scale its operations across France and Germany.

Yet the campaign also exploited unwitting European providers. Forensic analysis of IP addresses, such as 185.106.93.93 and the range 147.79.117.0/24, showed that Doppelganger operators frequently rented servers from legitimate companies like Hetzner (Germany) and Hostinger (Lithuania). These providers, with their largely automated, low-cost provisioning, allowed Russian operatives to quickly spin up new “front domains” when old ones were blocked. Hetzner, upon being notified of the abuse by investigative journalists in 2024, purged hundreds of customer accounts linked to the campaign, forcing the operators to migrate to more obscure, bulletproof networks.
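Published indicators like these can be checked against one’s own traffic or referral logs with nothing beyond the standard library. The sketch below uses only the two indicators quoted above; a real blocklist would be far longer and regularly updated, so treat this as a minimal pattern rather than a usable defense.

```python
import ipaddress

# Indicators quoted in the forensic analysis above.
FLAGGED_NETWORKS = [ipaddress.ip_network("147.79.117.0/24")]
FLAGGED_HOSTS = {ipaddress.ip_address("185.106.93.93")}

def is_flagged(ip: str) -> bool:
    """True if an address matches a flagged host or falls in a flagged range."""
    addr = ipaddress.ip_address(ip)
    return addr in FLAGGED_HOSTS or any(addr in net for net in FLAGGED_NETWORKS)
```

A log-processing job would simply map `is_flagged` over the source addresses of inbound redirects to surface hits for analyst review.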

Entity | Role in Doppelganger | Status (as of 2026)
Aeza Group | Primary bulletproof hosting hub; hosted TDS and payload sites. | Sanctioned by US/UK (July 2025).
Stark Industries Solutions | Provided resilient infrastructure for scaling operations. | Sanctioned by EU (May 2025); rebranded to WorkTitans.
Keitaro / Kehr | Traffic Distribution System (TDS) software used for cloaking. | Software licenses revoked for identified abuse; illicit use continues.
Structura / SDA | Russian IT firms attributed as the operators of the campaign. | Sanctioned by EU (2023) and US (2024).

Attribution of these server clusters to the Russian state was achieved through a combination of financial and technical trails. Meta’s security teams and the French agency VIGINUM traced payment records for the hosting services back to Structura National Technologies and the Social Design Agency (SDA), two Moscow-based firms with direct ties to the Kremlin. Moreover, operational security failures by the attackers, such as leaving server status pages publicly accessible, allowed researchers to map the full extent of the network, linking the disinformation sites to the same IP blocks used for other Russian state-sponsored cyberactivities.

The Kyborg Leaks: Inside the Social Design Agency

In September 2024, the inner workings of the Doppelganger campaign were laid bare by a massive data breach known as the Kyborg Leaks. Obtained by the Ukrainian hacktivist group KibOrg (frequently transliterated as Kyborg) and shared with a consortium of European media outlets including Süddeutsche Zeitung and Delfi Estonia, these internal documents provided a forensic look at the administrative backend of Russia’s psychological warfare. The leak exposed the Social Design Agency (SDA), a Moscow-based firm led by political technologist Ilya Gambashidze, as the primary operational hub for the Kremlin’s disinformation architecture.

The documents, which span late 2022 through early 2024, reveal that the SDA operates not as a rogue hacking group but as a corporate vendor with strict deliverables, budget cycles, and performance reviews monitored directly by the Russian Presidential Administration. The files confirm that Sergei Kiriyenko, the Deputy Chief of Staff to President Vladimir Putin, presided over regular meetings where SDA executives presented their metrics. This direct chain of command obliterates any remaining “plausible deniability” regarding the state’s involvement in the Doppelganger protocol.

One of the starkest revelations is the industrial scale of the content production. Internal reports from the first four months of 2024 alone show that the SDA’s “Russian Digital Army” generated approximately 33.9 million comments across social media platforms. The agency’s output is governed by rigid quotas rather than organic engagement. Project managers are required to meet daily targets for memes, videos, and falsified articles, which are then seeded into Western discourse using the typosquatted domains described in previous sections.

SDA Content Production Quotas (Sample Project: “Foreign Media”)
Content Type Monthly Target (Units) Target Platforms Primary Objective
Fabricated Articles 60+ Cloned Sites (Bild, Le Monde) Erode trust in government
Video Memes 180+ TikTok, Instagram Reels Mock Western leaders
Cartoons/Graphics 60+ Facebook, X (Twitter) Visual satire of sanctions
Bot Comments 400,000+ Social Media Threads Amplify manufactured dissent

The internal culture revealed by the leaks is militaristic yet bureaucratic. In one leaked video presentation intended for Kremlin officials, Gambashidze appears wearing a camouflage hoodie with a patch reading “Russian Ideological Troops,” styling himself as a “commander of special forces” in the information war. Despite this theatricality, the documents show a fixation on Western corporate metrics. The SDA tracks “Key Performance Indicators” (KPIs) such as Engagement Rate (ER) and “Opportunity To See” (OTS), a standard advertising metric used to estimate the number of times a viewer is likely to be exposed to a message. These metrics determine the agency’s funding, incentivizing the team to inflate their numbers through automated bot farms when organic traction fails.
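As a point of reference, both KPIs reduce to simple ratios. The sketch below uses standard advertising definitions and hypothetical function names; it is not derived from the leaked SDA tooling:

```python
def engagement_rate(interactions: int, impressions: int) -> float:
    """ER: likes, comments, and shares per impression served."""
    return interactions / impressions if impressions else 0.0

def opportunity_to_see(impressions: int, audience_size: int) -> float:
    """OTS: average number of exposures per member of the reached audience."""
    return impressions / audience_size if audience_size else 0.0
```

Because both figures rise mechanically with impression counts, a bot farm that buys or fakes impressions inflates the very numbers on which the agency's funding depends.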

The “Metodichka,” or narrative manual found within the files, outlines specific emotional triggers the operatives must exploit. For audiences in Germany and France, the directive is to stoke fear of economic decline and energy scarcity. For American audiences, the focus shifts to isolationism and the cost of aid to Ukraine. The documents explicitly instruct operatives to “exacerbate internal contradictions” within target nations. One specific campaign, internally codenamed “Project Matryoshka,” involved a multi-stage method in which fake accounts would query legitimate media outlets about non-existent scandals, creating a feedback loop of artificial curiosity that the SDA would then “satisfy” with forged evidence.

Financial records included in the leak indicate that the operation is well-funded yet cost-efficient compared to traditional military hardware. While specific budget totals for the entire Doppelganger apparatus remain fragmented, individual campaign sheets show monthly expenditures for social media promotion (paid ads on Meta and X) running into the hundreds of thousands of dollars. The SDA and its partner firm, Structura National Technologies, allocated significant resources to “cloaking” services, technical infrastructure designed to hide the Russian origin of the traffic from Western platform moderators. This investment in obfuscation technology confirms that the evasion of content moderation is a line item in their budget, not an afterthought.

The Kyborg Leaks demystify the Doppelganger campaign, stripping away the aura of sophisticated hacking to reveal a mundane, quota-driven office environment. The “trolls” are salaried employees filling out Excel spreadsheets, reporting to middle managers who in turn present PowerPoint slides to the Kremlin. This bureaucratic banality makes the operation’s output no less dangerous; rather, it proves that disinformation is a permanent, institutionalized function of the Russian state apparatus.

Redirect Chains: The Technical Mechanics of Traffic Obfuscation

The “Doppelganger” campaign relies on a sophisticated, multi-stage traffic distribution architecture designed to evade automated moderation while delivering targeted disinformation to specific European audiences. Forensic analysis by Qurium and the EU DisinfoLab in 2024 identified this structure as the “F-I-KE-D” protocol, a four-stage redirection chain that filters users based on their digital fingerprint. This method ensures that while a target in Berlin or Paris sees a fabricated Der Spiegel or Le Monde article, a content moderator in California or a web crawler receives a benign 404 error or a decoy page.

The operational logic of this traffic flow is strictly hierarchical, prioritizing the survival of the final destination domains over the expendable entry points. The chain functions as follows:

Table 15.1: The F-I-KE-D Redirection Architecture
Stage Component Function Lifespan
F Front Domains Expendable URLs shared on social media (e.g., X, Facebook) to initiate the click. Hours to Days
I Intermediary Domains Silent redirectors that obfuscate the link’s origin and strip referrer data. Weeks
KE Keitaro TDS Traffic Distribution System server that filters users by IP, device, and OS. Months
D Doppelganger Domains The final destination hosting the cloned media site (e.g., bild.ltd). Persistent

At the core of this infrastructure is the Keitaro Traffic Distribution System (TDS), a commercial-grade marketing tool repurposed for information warfare. Russian operators use Keitaro to implement granular “cloaking” rules. Between 2023 and 2025, investigators observed that these servers were configured to block traffic from known IP ranges associated with Meta, Google, and security vendors like Spamhaus. If a request originates from a flagged IP or a non-target region (such as the United States during a Germany-focused campaign), the TDS terminates the connection or redirects to a harmless “filler” site. This geofencing technique explains why platform moderators frequently fail to identify the malicious nature of reported links, as they are technically barred from witnessing the disinformation payload.
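The filtering logic described above can be sketched as a simplified routing function. Everything here is an illustrative assumption, not actual Keitaro configuration: the blocked IP ranges, the country codes, and the function name are all hypothetical.

```python
import ipaddress

# Hypothetical block lists; real TDS rules are extensive and config-driven.
MODERATOR_RANGES = [
    ipaddress.ip_network("157.240.0.0/16"),  # example: a platform's crawler range
    ipaddress.ip_network("66.249.64.0/19"),  # example: a search-engine bot range
]
TARGET_COUNTRIES = {"DE", "FR"}  # campaign geofence (illustrative)

def route_request(client_ip: str, country: str, user_agent: str) -> str:
    """Decide which page a TDS-style cloaking filter would serve."""
    ip = ipaddress.ip_address(client_ip)
    if any(ip in net for net in MODERATOR_RANGES):
        return "decoy"      # known platform/security-vendor IP: show filler site
    if "bot" in user_agent.lower():
        return "404"        # generic crawler fingerprint: terminate the chain
    if country not in TARGET_COUNTRIES:
        return "decoy"      # outside the campaign's geofence
    return "payload"        # real user in the target region: serve the clone
```

The key property, visible even in this toy version, is that a moderator's request and a target's request to the same URL take different branches, so the reviewer never witnesses the disinformation payload.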

The scale of this infrastructure is industrial. In a single monitored period between May and July 2024, the campaign generated over 147 unique “Front” domains using a cloaking service identified as “Kehr.” These domains were hosted on “bulletproof” networks, specifically utilizing providers like Aeza International and Lethost, which operate within European data centers but maintain opaque ownership structures linked to Russian entities. By placing the initial redirect nodes on European soil, the operators reduce latency and bypass preliminary geo-blocks that might flag traffic originating directly from Russia.

This technical obfuscation creates a persistent game of “whack-a-mole” for defenders. When a social media platform blacklists a “Front” domain, the operators simply discard it and register a new batch, while the valuable “Doppelganger” destination domains remain untouched and active. The “Intermediary” and “Keitaro” nodes act as air gaps, preventing the toxicity of the spammy entry links from contaminating the reputation of the final cloned news sites. Data from late 2024 indicates that even with sanctions and public exposure, the rotation speed of these redirect chains has only increased, with Front domains remaining active for less than 24 hours to minimize their forensic footprint.

Botnet: Burner Accounts and Automated Amplification

Digital Forensics: DNS Spoofing and Typosquatting Metrics

The operational logic of the Doppelganger campaign marks a departure from the persona-based influence operations of the mid-2010s. Unlike the Internet Research Agency’s earlier efforts to cultivate “influencers” with years of backstory, Doppelganger relies on a high-velocity, high-attrition model. The campaign treats social media accounts as munitions rather than assets: cheap, mass-produced, and instantly disposable. This “burner” philosophy allows Russian operators to overwhelm platform moderation systems through sheer volume, accepting the rapid deletion of thousands of accounts as a calculated operational cost.

Data from 2023 and 2024 reveals the industrial scale of this automation. Between December 2023 and January 2024 alone, German Foreign Office analysts identified over 50,000 inauthentic X (formerly Twitter) accounts linked to the campaign. These accounts collectively generated 1.8 million German-language posts in just six weeks. The primary function of these bots is not to engage in debate but to act as a delivery system for links to cloned media sites. Once a link is seeded, the accounts are frequently abandoned or suspended, only to be replaced by a fresh wave of automated registrations within hours.

The campaign employs a bifurcated structure on X to maximize reach before detection. “Content accounts” originate the primary disinformation posts, while a larger swarm of “amplification accounts” instantly reposts, likes, and replies to this content to manipulate platform algorithms. This division of labor protects the core distribution nodes while sacrificing the lower-value amplification bots. In May 2024, forensic analysis showed that on peak days, this network posted more than one tweet per second, a rate of fire that physically precludes human operation.

Table 16.1: Doppelganger Automation Metrics (Selected Periods 2023–2024)
Metric Volume / Rate Platform Source
Identified Bot Accounts 50,000+ X (Twitter) German Foreign Office (2024)
Total Automated Posts 1.8 Million (6 weeks) X (Twitter) German Foreign Office (2024)
Ad Reach 38 Million Accounts Meta (Facebook/Instagram) AI Forensics (2024)
Content Production Rate 1 Article / 50 Minutes Cloned Websites EU DisinfoLab / Viginum
Ad Volume 3,826+ Ads Meta (Facebook/Instagram) AI Forensics (2024)

On Meta platforms, the tactics shift from swarming to “hit-and-run” advertising. Instead of maintaining persistent pages, Doppelganger operators use “burner” accounts to purchase thousands of low-cost advertisements. A 2024 investigation by AI Forensics found that between August 2023 and March 2024, the campaign ran over 3,826 ads targeting French and German users. These ads, frequently paid for with disposable credit cards or compromised accounts, reached over 38 million users. The accounts responsible for these ads frequently post only once before going dormant or being banned, rendering traditional reputation-based moderation tools ineffective.

The technical infrastructure supporting these burner accounts involves complex obfuscation. Links shared by the botnets rarely point directly to the fake news sites. Instead, they pass through a chain of Keitaro traffic distribution systems, intermediary servers that filter out bots, researchers, and platform crawlers. If the system detects a non-human visitor (like a Facebook moderation bot), it redirects to a benign page, such as a cooking blog or a generic 404 error. Only real users are funneled to the disinformation content. This “cloaking” technique extends the lifespan of the burner accounts by delaying the automated flagging of malicious links.
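Investigators commonly detect this kind of cloaking by fetching the same link under two different identities and comparing the responses. A minimal sketch, with a caller-supplied `fetch` callable standing in for real HTTP requests (the function names and user-agent strings are illustrative):

```python
import hashlib

def fingerprint(body: bytes) -> str:
    """Short content hash for comparing two responses."""
    return hashlib.sha256(body).hexdigest()[:12]

def is_cloaked(fetch, url: str) -> bool:
    """Probe a URL as a crawler and as a browser; flag divergent content.

    `fetch(url, user_agent=...)` must return the response body as bytes.
    A link serving materially different pages to the two identities is
    behaving like the TDS filters described above.
    """
    as_crawler = fetch(url, user_agent="facebookexternalhit/1.1")
    as_browser = fetch(url, user_agent="Mozilla/5.0 (Windows NT 10.0)")
    return fingerprint(as_crawler) != fingerprint(as_browser)
```

In practice the probes would also need to vary IP geolocation, since, as noted above, the filters key on IP ranges as well as user-agent strings.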

Recidivism remains a central feature of the network. Following a major takedown by X in early 2024, the campaign successfully regenerated tens of thousands of accounts within weeks. This resilience is powered by generative AI, which automates the creation of profile pictures, biographies, and initial “filler” content to bypass spam filters. The content generation itself is similarly automated; the network publishes a new disinformation article approximately every 50 minutes, a pace maintained by AI tools that scrape, rewrite, and translate content from legitimate Russian sources into target languages.

The effectiveness of this burner model lies in its asymmetry. Platforms must expend significant resources to identify and remove networks, while the cost for operators to spin up new instances is negligible. By flooding the zone with disposable accounts, Doppelganger forces defenders into a permanent game of “whack-a-mole,” where the time to detection frequently exceeds the time required for a disinformation narrative to gain initial traction. The sheer volume of noise created by these automated systems serves a secondary purpose: it artificially inflates the perceived popularity of fringe narratives, tricking trending algorithms and casual readers into believing a manufactured consensus exists.

Federal Intervention: The DOJ Domain Seizures of 2024

On September 4, 2024, the United States Department of Justice executed a decisive strike against the “Doppelganger” infrastructure, seizing 32 internet domains used by Russian state-sponsored actors to wage information warfare against American voters. This operation, authorized by the U.S. District Court for the Eastern District of Pennsylvania, exposed the direct operational link between the Kremlin and the private firms tasked with cloning Western media. The unsealed affidavit identified two Russian companies, Social Design Agency (SDA) and Structura National Technologies, as the primary architects of this network, operating under the direct supervision of Sergei Kiriyenko, Deputy Chief of Staff of the Russian Presidential Executive Office.

The federal intervention revealed that these entities did not merely host propaganda; they built a sophisticated technical apparatus designed to deceive users through “cybersquatting.” By registering domains that visually and phonetically mimicked legitimate news organizations, Russian operatives created a “hall of mirrors” where fabricated stories appeared to carry the authority of established journalism. The seized domains were not random; they were calculated forgeries targeting specific trusted outlets in the United States, France, and Germany. For instance, the operation utilized washingtonpost.pm to impersonate The Washington Post, populating the site with anti-Ukraine narratives and alarmist content regarding U.S. border security.

Operational Mechanics and Attribution

The Justice Department’s filings provided a rare, granular look at the command structure behind Doppelganger. Unlike previous decentralized bot campaigns, this operation was run with corporate precision. SDA, led by Ilya Gambashidze, functioned as the content creation hub, generating the articles and memes, while Structura, led by Nikolai Tupikin, managed the technical infrastructure, including server acquisition and domain registration. The affidavit detailed how these firms used U.S.-based domain registrars to purchase the URLs, frequently employing anonymized payment methods to obscure their origin. Of the 32 seized domains, 14 were registered through Namecheap, with others spread across GoDaddy and NameSilo, exploiting the automated nature of commercial domain registration.

The seized assets fell into two distinct categories: direct spoofs of major media outlets and “unique media brands” created to serve as aggregators for disinformation. The table below outlines verified examples of the domains seized during the September 2024 operation.

Table 17.1: Verified Seized Doppelganger Domains (September 4, 2024)
Targeted/Spoofed Entity Seized Domain Name Primary Disinformation Narrative
The Washington Post washingtonpost.pm Criticism of U.S. aid to Ukraine; border emergency exaggeration.
Le Monde (France) lemonde.ltd Anti-Macron sentiment; fabrication of French economic collapse.
Le Parisien (France) leparisien.ltd Social unrest exaggeration; anti-NATO messaging.
Bild (Germany) bild.ltd Energy emergency fear-mongering; anti-Green party rhetoric.
Standalone Brand shadowwatch.us Conspiracy theories regarding U.S. intelligence operations.
Standalone Brand truthgate.us Deep state narratives; election integrity doubts.
Standalone Brand warfareinsider.us False reports on U.S. military readiness and NATO failures.

Concurrent Financial Indictments

While the domain seizures dismantled the technical infrastructure of Doppelganger, a simultaneous DOJ action illuminated the financial underpinnings of Russia’s broader influence operations. On the same day, federal prosecutors unsealed an indictment against two employees of RT (formerly Russia Today), Kostiantyn Kalashnikov and Elena Afanasyeva. They were charged with funneling nearly $10 million through a network of shell companies to a Tennessee-based content creation firm, identified in court documents as Tenet Media. Although legally distinct from the SDA/Structura domain seizures, this indictment confirmed that the Kremlin was willing to spend millions to launder its narratives through unwitting North American influencers, “gray-washing” state propaganda into domestic political commentary.

Resilience and Adaptation

The seizure of 32 domains disrupted the immediate distribution of Doppelganger content, yet the network demonstrated rapid resilience. Forensic analysis by the Atlantic Council’s DFRLab confirmed that within 24 hours of the DOJ announcement, new clones emerged using different top-level domains (TLDs) such as .cc, .co, and .pw. This “whack-a-mole” dynamic shows that while legal interventions can impose costs and friction, the underlying infrastructure of the Doppelganger protocol remains modular and easily replicable. The operation’s ability to migrate instantly suggests that SDA and Structura maintain a reserve of dormant domains and server capacity, treating individual seizures as a calculated cost of doing business rather than a terminal blow.
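Defenders tracking this TLD-hopping can mechanically match newly observed registrations against seized domains by their base name. A minimal sketch (the function name is hypothetical; real monitoring would also catch lookalike spellings, not just exact base-name matches):

```python
def migrated_clones(seized: set[str], observed: set[str]) -> dict[str, list[str]]:
    """Map each seized domain to newly observed domains sharing its base name.

    Captures the TLD-rotation pattern described above, e.g. a seized
    bild.ltd reappearing as bild.cc or bild.pw.
    """
    out: dict[str, list[str]] = {}
    for dom in seized:
        base = dom.rsplit(".", 1)[0]  # strip the TLD
        hits = sorted(d for d in observed
                      if d.rsplit(".", 1)[0] == base and d != dom)
        if hits:
            out[dom] = hits
    return out
```

Run daily against new-registration feeds, such a check turns each seizure into a tripwire for the replacement infrastructure.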

The Treasury’s Hammer: Sanctioning the Financial Infrastructure

The United States government shifted its counter-offensive strategy against the “Doppelganger” campaign from mere exposure to direct economic warfare in 2024. On March 20, 2024, the Department of the Treasury’s Office of Foreign Assets Control (OFAC) designated two primary Russian entities, Social Design Agency (SDA) and Structura National Technologies (Structura), alongside their respective leaders, Ilya Andreevich Gambashidze and Nikolai Aleksandrovich Tupikin. These designations, executed under Executive Order 14024, targeted the commercial backbone of the Kremlin’s disinformation apparatus, freezing U.S. assets and criminalizing financial interactions with these operatives.

Treasury investigations revealed that SDA and Structura operated not as rogue hacktivists but as contracted service providers for the Russian Presidential Administration. Gambashidze and Tupikin managed a sprawling network of over 60 typosquatted domains designed to impersonate legitimate European news outlets. The sanctions severed their access to Western financial markets, complicating their ability to purchase web hosting, domain registration services, and paid social media amplification. This action marked the first time the U.S. government explicitly targeted the private-sector contractors responsible for the technical implementation of the Doppelganger protocol.

Sanctioned Operatives and Entities (2024)

Date Entity / Individual Role / Affiliation Legal Authority
March 20, 2024 Social Design Agency (SDA) Primary content creation firm for Doppelganger E.O. 14024
March 20, 2024 Structura National Technologies Technical infrastructure and domain management E.O. 14024
March 20, 2024 Ilya Gambashidze Founder of SDA; campaign architect E.O. 14024
March 20, 2024 Nikolai Tupikin CEO of Structura; technical lead E.O. 14024
Sept 4, 2024 ANO Dialog Non-profit leveraging AI for disinformation E.O. 14024
Sept 4, 2024 Vladimir Tabak Director General of ANO Dialog E.O. 14024

The campaign against these actors escalated significantly on September 4, 2024. OFAC expanded its target list to include “ANO Dialog,” a Russian non-profit organization, and its subsidiary “ANO Dialog Regions.” The Treasury identified ANO Dialog as a key player in leveraging artificial intelligence to generate disinformation content for the Doppelganger network. Vladimir Grigoryevich Tabak, the Director General of ANO Dialog, was personally sanctioned for his role in coordinating these efforts with senior Russian government officials. This wave of sanctions coincided with the Department of Justice seizing 32 internet domains used by these actors to conduct foreign malign influence campaigns, physically dismantling a portion of the infrastructure while the Treasury attacked the financing.

Financial forensics linked to the March designations exposed the use of cryptocurrency to fund these operations. OFAC identified specific TRON (TRX) wallet addresses associated with Ilya Gambashidze, which were used to pay for infrastructure costs. Between April 2022 and March 2024, one of these wallets received the majority of its funds from Garantex, a sanctioned Russian exchange. This transactional data provided the evidentiary link proving that the Doppelganger campaign was not a grassroots movement but a funded, state-directed operation with a clear money trail leading back to Moscow.

Visualizing the September 2024 Crackdown

Scope of U.S. Government Actions (Sept 4, 2024)

Individuals Sanctioned: 10
Entities Sanctioned: 2
Domains Seized: 32

Data Source: U.S. Department of the Treasury & Department of Justice

The coordination between the Treasury and the Department of Justice demonstrates a “whole-of-government” approach to countering foreign influence. While the DOJ utilized civil forfeiture laws to seize the domain names `washingtonpost.pm` and `bild.ltd`, the Treasury’s actions ensured that the individuals behind the keyboards could no longer interact with the U.S. financial system. This dual pressure forces Russian operatives to constantly migrate their infrastructure to less reliable, non-Western service providers, increasing their operational costs and technical friction. The September 4 actions specifically highlighted the role of RT executives in recruiting unwitting American influencers, further widening the scope of sanctions to include those who attempt to launder disinformation through authentic Western voices.

The Shift from Observation to Enforcement

The European Union’s response to the Doppelganger campaign marked a definitive transition from passive monitoring to active legal enforcement under the Digital Services Act (DSA). By late 2023, the European Commission ceased treating Russian disinformation as a diplomatic irritant and began classifying it as a widespread risk requiring immediate mitigation by Very Large Online Platforms (VLOPs). This pivot was formalized in December 2023, when the Commission opened its formal proceedings against X (formerly Twitter), citing specific failures to counter illegal content and information manipulation.

The enforcement strategy targeted the algorithmic amplification of Doppelganger content rather than solely focusing on individual takedowns. Under the DSA, platforms with over 45 million monthly active users must prove they are actively mitigating risks to civic discourse. The Commission’s investigations revealed that major platforms had failed to adequately police the “typosquatting” infrastructure central to the Doppelganger protocol, allowing cloned domains to circulate widely before detection.

Formal Proceedings and Preliminary Findings

In 2024, the Commission escalated its actions, launching formal non-compliance investigations against Meta (Facebook/Instagram) and TikTok. These proceedings moved beyond general warnings to address specific technical failures that facilitated the Doppelganger operation.

Table 19.1: Key DSA Enforcement Actions Linked to Russian Disinformation (2023–2025)
Platform Date Proceedings Opened Specific Allegations Related to Disinformation Status (as of Dec 2025)
X (Twitter) December 18, 2023 Failure to mitigate information manipulation; deceptive “Blue Check” design; absence of ad transparency. Preliminary findings of non-compliance issued July 2024.
Meta (FB/Insta) April 30, 2024 Ineffective handling of deceptive ads; insufficient data access for researchers; “Doppelganger” amplification. Preliminary findings of transparency breach issued Oct 2025.
TikTok February 19, 2024 Widespread risks to electoral processes; absence of a functional repository for ads used in influence campaigns. Investigation ongoing; “Lite” reward feature suspended in EU.

On July 12, 2024, the Commission issued preliminary findings against X, stating that its “verified” account system, which Doppelganger operatives used to purchase credibility, deceived users and violated DSA transparency rules. The investigation found that the platform’s removal of the “Blue Check” verification criteria allowed Russian operatives to buy verification for bot accounts, which then amplified links to cloned media sites like bild.ltd and washingtonpost.pm.

Similarly, the proceedings against Meta, opened in April 2024, explicitly referenced the “Doppelganger” campaign. The Commission cited the “incessant creation” of fake accounts and the platform’s inability to stop paid advertisements from promoting pro-Kremlin narratives. Data from EU DisinfoLab and VIGINUM (France’s agency against foreign digital interference) supported these charges, identifying over 31,000 fake articles hosted on the RRN (Reliable Recent News) platform alone as of September 2024.

Sanctions and Asset Freezes

The Architects: Social Design Agency and Structura National Technologies

Parallel to platform regulation, the EU deployed targeted sanctions against the operational architects of Doppelganger. In July 2023, the Council of the European Union sanctioned the Social Design Agency (SDA) and Structura National Technologies, the two Russian IT firms identified as the technical backbone of the campaign. These sanctions froze the assets of key individuals, including Ilya Gambashidze, the founder of SDA, and prohibited EU companies from doing business with them.

The financial pressure intensified in 2024. Investigations revealed that Doppelganger operatives had used European server infrastructure to host their cloned sites. In response, the EU expanded its sanctions regime to include “internet access services” that knowingly facilitated these operations. This legal maneuver forced European hosting providers to terminate contracts with shell companies masking SDA activities, disrupting the campaign’s ability to maintain stable URLs.

Data Access for Vetted Researchers

A critical component of the DSA enforcement involves Article 40, which mandates that VLOPs provide vetted researchers with access to platform data. The Commission’s October 2025 preliminary findings against Meta and TikTok highlighted their failure to comply with this requirement. By restricting access to tools like CrowdTangle, platforms had blinded independent watchdogs to the scale of the Doppelganger operation.

VIGINUM’s analysis demonstrated the value of this data. Their April 2024 report, which relied on open-source intelligence, exposed a network of over 1,000 bots on X dedicated to amplifying RRN content. The Commission argued that without full API access, the true reach of such networks remains undercounted, preventing an accurate assessment of the “widespread risk” to European democracy.

Narrative Clusters: Mapping the Anti-Ukraine Sentiment Graph

Between 2022 and 2025, the Doppelganger campaign abandoned the chaotic, scattershot approach of earlier Russian information operations in favor of a highly structured “sentiment graph.” This method did not flood the zone with noise; it segmented Western audiences into specific psychographic clusters, feeding each a tailored version of anti-Ukraine reality. By 2024, forensic analysis by Viginum and EU DisinfoLab identified three primary narrative super-clusters that operated with industrial synchronization across the cloned media ecosystem.

The first and most voluminous cluster, the “Western Suicide” narrative, targeted conservative and working-class demographics in Germany, France, and Italy. This narrative track argued that sanctions against Russia were an act of economic self-immolation. Data from the Bavarian Office for the Protection of the Constitution revealed that between May 2023 and July 2024, over 30% of the 7,983 identified Doppelganger campaigns specifically targeted German audiences with this message. These articles, hosted on clones of Der Spiegel and Bild, frequently correlated spikes in local energy prices directly with aid packages to Kyiv, presenting a zero-sum game where a warm German home required a cold Ukrainian shoulder.

A second, more aggressive cluster focused on the “Black Hole” theory, designed to erode trust in institutional oversight. This narrative thread posited that Ukraine was not a victim of aggression but a sinkhole of corruption where Western weaponry vanished into the black market. In late 2023 and early 2024, this cluster produced sophisticated forgeries, including a fake Le Point article alleging a coup attempt by General Valery Zaluzhny against President Zelensky. The “Reliable Recent News” (RRN) portal served as the central node for this content, aggregating these fabrications under the guise of fact-checking. RRN traffic data from late 2024 showed a pivot toward US audiences, amplifying claims that FEMA funds for Hurricane Milton and Helene victims were being diverted to the Ukrainian front lines.

Table 1: Primary Anti-Ukraine Narrative Clusters (2023–2025)
Narrative Cluster Target Demographic Core Message Verified Reach / Volume
Economic Ruin DE, FR, IT (Working Class) “Sanctions hurt you more than Russia.” 250,061 clicks (Germany, May ’23–July ’24)
The Black Hole US, UK (Taxpayers) “Your money is being stolen/laundered.” 1,366 X posts, 4.66M views (June 2024)
Social Contagion PL, FR (Farmers/Rural) “Ukrainian imports destroy local livelihood.” 170+ targeted ads (Jan–Feb 2024)

The third cluster, identified as “Social Contagion,” exploited hyper-local grievances to wedge Ukraine into domestic disputes. This tactic was most visible during the European farmers’ protests of early 2024. Doppelganger operatives purchased over 170 advertisements on Meta platforms in January and February 2024 alone, specifically targeting French and Polish agricultural communities. These ads did not discuss the war’s morality but focused entirely on the economic threat of cheap Ukrainian grain and chicken imports. By framing Ukrainian solidarity as a direct threat to the livelihood of European farmers, the campaign successfully moved anti-Ukraine sentiment from the geopolitical abstract to the kitchen table.

“The architecture of these narratives is not ideological but functional. They do not need you to love Russia; they only need you to resent the cost of supporting Ukraine. The ‘Social Contagion’ cluster proves they can weaponize the price of chicken just as effectively as the threat of nuclear war.”

The operational discipline behind these clusters suggests a centralized command structure. Meta’s adversarial threat report from May 2024 noted that while the “smash-and-grab” tactics of domain registration remained crude, the narrative discipline tightened. When the “Western Suicide” narrative failed to gain traction in the United States, the network rapidly pivoted to the “Black Hole” narrative, specifically weaponizing the U.S. border emergency. Internal Kremlin documents obtained by European intelligence services in 2024 confirmed this adaptability, showing instructions to “cultivate an environment” where Americans viewed border security and Ukraine aid as mutually exclusive financial choices.

This segmentation allows Doppelganger to maintain contradictory truths simultaneously. To a far-left audience in Berlin, the war is portrayed as American imperialism sacrificing Ukrainian lives; to a far-right audience in Texas, it is portrayed as a globalist money-laundering scheme. The “Reliable Recent News” network acts as the clearinghouse, ensuring that while the stories differ, the downstream effect, paralysis of Western support, remains uniform. By late 2025, the volume of these clustered narratives on X (formerly Twitter) had stabilized, but their integration into authentic discourse loops made them increasingly difficult to disentangle from organic political dissent.

The False Fact-Checker: Fake Verification Portals

The evolution of the Doppelganger protocol has birthed a sophisticated sub-species of disinformation: the counterfeit verification portal. Moving beyond simple media cloning, Russian state-backed operators have weaponized the aesthetics of open-source intelligence (OSINT) and fact-checking to insulate their narratives from scrutiny. This tactic relies on “authority bias,” where audiences instinctively trust formats that appear to debunk falsehoods, even when those “debunkings” are themselves fabricated.

The primary engine of this strategy is WarOnFakes.com (and its associated Telegram channels), which launched on February 24, 2022, the same day as the full-scale invasion of Ukraine. Unlike traditional propaganda outlets, War on Fakes mimics the visual language of Western fact-checkers, using “False” and “True” stamps, granular video analysis, and pseudo-forensic language. Within two weeks of its launch, the project amassed over 700,000 subscribers, a growth rate that researchers at the Atlantic Council’s DFRLab attributed to coordinated amplification by Russian state media and diplomatic accounts.

War on Fakes established the template for “defensive disinformation.” Its operatives systematically labeled real documentation of Russian atrocities as “staged productions.” In April 2022, when images of executed civilians emerged from Bucha, War on Fakes immediately published a “debunk” claiming the bodies were crisis actors who moved their limbs, a claim that was later disproven by high-resolution satellite imagery but was nonetheless widely circulated by Russian embassies in the UK, France, and Geneva.

Verified Russian Fake Verification Portals (2022–2025)
Portal Name Launch / Active Operator / Link Primary Tactic
War on Fakes Feb 2022 ANO Dialog / Kremlin-linked “Debunking” real war crimes as “staged” events.
RRN (Reliable Recent News) Summer 2022 Social Design Agency (SDA) Mixing fake fact-checks with cloned Western media reports.
Global Fact-Checking Network (GFCN) Nov 2024 (Announced) TASS / ANO Dialog Institutional mimicry of the International Fact-Checking Network (IFCN).

The strategy evolved significantly with the integration of these portals into the wider Doppelganger infrastructure. The “Reliable Recent News” (RRN) portal, originally “Reliable Russian News,” serves as a central repository for the campaign’s content. Investigations by EU DisinfoLab and Meta linked RRN to the Russian IT firms Structura and the Social Design Agency (SDA). RRN does not merely publish fake news; it publishes “verifications” of fake news, creating a closed loop in which a Doppelganger-created forgery (such as a fake Der Spiegel cover) is “verified” by RRN as authentic, then amplified by bot networks.

The most ambitious escalation occurred in late 2024 with the unveiling of the Global Fact-Checking Network (GFCN). Presented at the “Dialogue about Fakes 2.0” forum in Moscow in November 2024 and fully active by April 2025, this initiative represents a shift from tactical deception to strategic institutional mimicry. Co-founded by the state news agency TASS and the sanctioned entity ANO Dialog, the GFCN was explicitly framed by Russian Foreign Ministry spokesperson Maria Zakharova as a counterweight to Western “pseudo-fact-checking.”

The GFCN attempts to rival the legitimate International Fact-Checking Network (IFCN) by adopting a nearly identical acronym and recruiting “experts” from friendly nations to validate Kremlin narratives. This “zombie institution” allows Russian operatives to launder disinformation through a body that possesses the trappings of international oversight. By 2025, the network had begun publishing “investigations” that mirrored the formatting of legitimate outlets like Bellingcat, specifically targeting narratives around Western sanctions and military aid to Ukraine.

This “hall of mirrors” effect creates a paralysis of truth. When a user encounters a claim, they also encounter a pre-fabricated “fact-check” that denies it. The goal is not necessarily to convince the audience of a specific lie but to degrade the concept of verification itself, making it impossible for an average user to distinguish between a legitimate debunking and a state-sponsored forgery.

Cross-Platform Migration: From Facebook to X and Telegram

The operational logic of the Doppelganger campaign shifted markedly between late 2022 and 2024. While the initial infrastructure relied on Facebook’s advertising tools to seed disinformation, aggressive takedowns by Meta forced Russian operators to diversify their distribution channels. The campaign did not stop; it moved. This migration was not a retreat but a tactical adaptation to the changing regulatory and technical environments of Western social media platforms. The operators identified X (formerly Twitter) and Telegram as more permissive environments for their “burner” account strategy, exploiting the erosion of moderation standards on one and the inherent privacy of the other.

Meta’s adversarial threat reports from 2023 indicate a massive volume of activity, with the company removing over 3,800 pages and groups targeting France and Germany alone. These pages reached an estimated 38 million accounts before suspension. The high attrition rate on Facebook made the platform expensive and labor-intensive for the Doppelganger operators. In response, they pivoted to X, where the removal of verification guardrails under Elon Musk’s ownership provided a new opening. Russian operatives began purchasing “Blue” checkmarks for bot accounts, a tactic identified by the Washington Post in February 2023. These paid verifications allowed fraudulent accounts to masquerade as legitimate news sources or verified individuals, bypassing the platform’s algorithmic filters that previously suppressed unverified, low-quality content.

The scale of this activity on X became clear during a six-week investigation by the German Federal Foreign Office. Between December 2023 and January 2024, analysts identified more than 50,000 inauthentic accounts disseminating over 1.8 million posts. These accounts operated in coordinated swarms, replying to real users with automated scripts and hijacking trending hashtags to inject pro-Kremlin narratives into unrelated conversations. Unlike the Facebook strategy, which relied heavily on paid ads to generate initial traffic, the X strategy utilized the platform’s recommendation algorithm to amplify content organically through sheer volume and engagement manipulation.

Telegram serves a distinct function within this ecosystem. It acts as the “backend” or content repository for the campaign. Channels such as “Reliable Recent News” (RRN) and “WarOnFakes” function as staging grounds where disinformation is created, tested, and stored. Operatives then disseminate links to these Telegram posts across X and Facebook, frequently using complex redirect chains to disguise the source. This method protects the core content from takedowns; even if the social media posts are removed, the original material remains accessible on Telegram, ready to be reshared by a fresh wave of bot accounts. The platform’s absence of algorithmic moderation allows these repositories to grow unchecked, serving as a permanent library of forgeries.

The technical sophistication of the redirect method also evolved during this migration. Operators began using services like Kehr.io and legitimate cloud hosting providers to mask the final destination of the links shared on social media. A user clicking a link on X might pass through three or four intermediate servers before landing on a cloned news site. This “geofencing” technique allows the campaign to filter out researchers and automated scanners, showing the disinformation only to users with specific IP addresses or browser fingerprints. This level of obfuscation complicates attribution and delays the blocking of malicious domains.
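The gatekeeping behavior described above can be modeled as a simple routing rule: inspect who is asking, then serve either a decoy or the payload. The sketch below is a minimal illustration of how such a traffic distribution system might behave; all function names, crawler signatures, country lists, and URLs are hypothetical assumptions, not the actual “Kehr” implementation.

```python
# Illustrative model of a traffic distribution system (TDS) gatekeeper.
# Every name, rule, and URL here is a hypothetical assumption that
# sketches the cloaking behavior described in the text.

CRAWLER_SIGNATURES = ("facebookexternalhit", "twitterbot", "googlebot")
TARGET_COUNTRIES = {"DE", "FR", "US"}  # audiences the campaign targets

def route_request(user_agent: str, country_code: str) -> str:
    """Return the URL a visitor is redirected to.

    Platform crawlers and out-of-scope visitors receive a benign decoy
    page; users in targeted countries are passed down the redirect chain.
    """
    ua = user_agent.lower()
    if any(sig in ua for sig in CRAWLER_SIGNATURES):
        return "https://decoy.example/harmless-article"  # shown to moderation bots
    if country_code not in TARGET_COUNTRIES:
        return "https://decoy.example/harmless-article"  # researchers outside scope
    return "https://clone.example/payload-article"       # disinformation payload

# A Meta link-preview crawler sees only the decoy:
print(route_request("facebookexternalhit/1.1", "US"))
# A German user with an ordinary browser reaches the payload:
print(route_request("Mozilla/5.0 (Windows NT 10.0)", "DE"))
```

This asymmetry is why automated link scanning alone cannot catch the campaign: the scanner and the victim never see the same page.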

Table 22.1: Operational Metrics by Platform (2023–2024)
Metric Meta (Facebook/Instagram) X (formerly Twitter) Telegram
Primary Tactic Paid Ads & Fake Pages Paid Verification & Bot Swarms Content Hosting & Staging
Volume (Sample) 3,800+ pages removed (Aug ’23 – Mar ’24) 1.8 million posts in 6 weeks (Dec ’23 – Jan ’24) Permanent channels (RRN, etc.)
Reach/Exposure ~38 million accounts High visibility via “For You” feed Hub for cross-platform seeding
Moderation Response Automated takedowns & reporting Reactive, frequently requires media pressure Minimal to non-existent
Cost to Attacker High (Ad spend + account creation) Moderate ($8/month per verified bot) Low (Free channel creation)

The migration to X also coincided with a shift in content focus. While Facebook ads targeted broad demographics with sensationalist headlines, the X botnets engaged in direct harassment of journalists, politicians, and fact-checkers. This “swarming” tactic aims to silence opposition through intimidation and noise, drowning out factual reporting with a flood of automated replies. The integration of AI-generated comments has made these bots harder to distinguish from real users, as they can generate context-relevant text rather than simply repeating identical slogans. This evolution marks a transition from passive broadcasting to active, aggressive engagement in the public square.

The interplay between these platforms creates a resilient disinformation supply chain. Telegram provides the safe harbor for content, X provides the amplification and harassment capability, and Facebook, even with its stricter controls, remains a target for paid reach. The campaign’s ability to fluidly move assets and tactics between these ecosystems shows a high degree of operational agility, rendering single-platform countermeasures largely ineffective.

Audience Penetration: Measuring Authentic vs Inorganic Reach

The operational logic of the Doppelganger campaign prioritizes volume over precision, functioning as a digital war of attrition rather than a targeted surgical strike. While the sheer scale of the operation suggests a pervasive threat, a forensic analysis of engagement metrics reveals a clear divergence between inorganic reach (automated views, bot impressions, paid delivery) and authentic resonance (genuine human interaction, shares, and clicks). Data verified between 2023 and 2025 indicates that while the campaign successfully pollutes the information ecosystem, its ability to convert passive impressions into active belief remains statistically low.

The “Spray and Pray” architecture of Doppelganger is best illustrated by its click-through rates (CTR). According to a 2024 analysis by EU DisinfoLab, the operation launched 7,983 distinct dissemination campaigns between May 2023 and July 2024. Even with this massive output, these campaigns generated only 828,842 total clicks, an average of 103 clicks per campaign. This metric exposes the campaign’s fundamental weakness: it relies on brute-force flooding of social feeds rather than persuasive organic traction. The vast majority of “engagement” is likely internal, driven by bot networks amplifying their own noise to trick platform algorithms into trending the content.
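The per-campaign average cited above follows directly from the raw EU DisinfoLab totals; a one-line sanity check:

```python
# Sanity check on the EU DisinfoLab figures cited in the text.
campaigns = 7_983        # distinct dissemination campaigns, May 2023 - July 2024
total_clicks = 828_842   # verified clicks across all campaigns

avg_clicks = total_clicks // campaigns  # integer average, as reported
print(avg_clicks)  # 103 clicks per campaign
```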

The “Hollow Echo” Effect: Reach vs. Engagement

Profile: Ilya Gambashidze and the Kremlin Contract

The gap between reach and engagement is not accidental but a feature of the “burnout” strategy employed by Russian operatives. Meta’s adversarial threat reports from late 2023 and early 2024 found that Doppelganger assets purchased approximately $105,000 in advertisements to force content into the feeds of French and German users. These paid injections resulted in a potential reach of over 38 million accounts in France and Germany alone. Yet the conversion rate remains abysmal. The campaign operates on a “hollow echo” principle: millions see the headlines, few interact, and even fewer believe them.

This is compounded by the rapid decay of campaign assets. In September 2024, the U.S. Department of Justice seized 32 domains associated with the operation. Within 24 hours, operatives had registered new, nearly identical domains (using TLDs like .cc, .co, and .pw) to replace them. This high-churn model accepts that 90% of assets will be burned within days, relying on the remaining 10% to slip through moderation nets. The table below breaks down the campaign’s penetration by country, highlighting the heavy focus on Germany and France despite the low engagement yield.

Table 23.1: Doppelganger Campaign Penetration by Target Country (May 2023 – July 2024)

Target Country Total Campaigns Identified Total Verified Clicks Avg. Clicks per Campaign Primary Narrative Focus
Germany 2,250 250,061 111 Economic collapse, Energy crisis
France 2,245 249,481 111 Anti-NATO, Social unrest
United States 1,024 180,521 176 Election integrity, Border security
Ukraine 1,339 148,777 111 Military morale, Corruption
Israel 221 N/A (Low Volume) <50 Regional instability

Algorithmic Parasitism and Bot Inflation

To mask the absence of genuine human interest, Doppelganger employs “algorithmic parasitism.” VIGINUM, the French agency responsible for defending against foreign digital interference, detected that the campaign uses a technique called typosquatting. Operatives do not just post links; they use thousands of inauthentic X (formerly Twitter) accounts to reply to legitimate news threads with links to the fake clones (e.g., nato.ws or newsroad.online). This creates a false-consensus effect, making fringe narratives appear mainstream.
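As a defender-side illustration, typosquats of the kind described here can be flagged by comparing the registrable name of a new domain against a list of protected brands. This is a minimal sketch assuming a hand-picked brand list and an arbitrary similarity threshold; real monitoring systems use far richer signals (registration metadata, homoglyph tables, hosting history).

```python
# Sketch: flag candidate typosquats by name similarity to protected brands.
# The brand list and the 0.8 threshold are illustrative assumptions.
from difflib import SequenceMatcher

PROTECTED = ["washingtonpost.com", "bild.de", "lemonde.fr", "nato.int"]

def name_of(domain: str) -> str:
    """Strip the TLD, keeping the registrable name (bild.ltd -> bild)."""
    return domain.rsplit(".", 1)[0]

def is_suspect(domain: str, threshold: float = 0.8) -> bool:
    """True when the registrable name closely matches a protected brand
    while the full domain differs, e.g. bild.ltd vs bild.de."""
    for brand in PROTECTED:
        if domain == brand:
            return False  # the brand's own domain is not a squat
        if SequenceMatcher(None, name_of(domain), name_of(brand)).ratio() >= threshold:
            return True
    return False

print(is_suspect("washingtonpost.pm"))  # True: same name, swapped TLD
print(is_suspect("example.org"))        # False: unrelated name
```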

Yet platform data shows these bots frequently misfire. In one documented instance from June 2024, a network of 5,000 fake accounts was removed by Meta. Analysis showed that over 80% of the interactions on their posts came from other accounts within the same bot network, creating a closed loop of engagement that failed to penetrate real user bubbles. The campaign talks to itself, inflating metrics to satisfy quotas set by Kremlin contractors like the Social Design Agency (SDA), rather than achieving genuine psychological impact.
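The closed-loop pattern Meta described can be quantified with a simple metric: the share of interactions on a network’s posts that originate inside the same network. The interaction records below are fabricated for illustration; only the structure of the calculation reflects the analysis described in the text.

```python
# Sketch: measure how "closed" a suspected bot network's engagement loop is.
# The sample data is illustrative, not Meta's actual takedown data.

def internal_engagement_ratio(network: set[str],
                              interactions: list[tuple[str, str]]) -> float:
    """Fraction of interactions on the network's posts that come from
    accounts inside the same network. Records are (actor, post_author)."""
    on_network_posts = [(actor, author) for actor, author in interactions
                        if author in network]
    if not on_network_posts:
        return 0.0
    internal = sum(1 for actor, _ in on_network_posts if actor in network)
    return internal / len(on_network_posts)

bots = {"bot1", "bot2", "bot3"}
log = [("bot1", "bot2"), ("bot2", "bot3"), ("bot3", "bot1"),
       ("bot1", "bot3"), ("real_user", "bot2")]
print(f"{internal_engagement_ratio(bots, log):.0%}")  # 80%: mostly self-talk
```

A ratio near 100% indicates the network is amplifying itself rather than reaching organic users, which is exactly the failure mode documented in the June 2024 takedown.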

Investigative Note: The high volume of “clicks” in the U.S. sector (176 avg) compared to Europe suggests either a higher susceptibility to clickbait headlines in the American information environment or a more aggressive bot-driven click-fraud strategy targeting U.S. ad exchanges.

Visualizing the Engagement Gap

The following chart description illustrates the massive gap between the campaign’s paid reach and its actual verified engagement. It highlights the inefficiency of the Russian “firehose of falsehood” model when applied to sophisticated Western digital ecosystems.

Chart 23.1 Description: The “Funnel of Failure”: Reach vs. Reality (2023–2024)
A dual-axis bar and line chart. The left axis (bar, logarithmic scale) represents “Potential Reach/Impressions,” showing a towering bar at 38,000,000 for the France/Germany sector. The right axis (line, linear scale) represents “Verified User Clicks,” showing a flat line hovering at just ~500,000 for the same region. The chart is color-coded with “Inorganic/Paid Reach” in alarming red and “Authentic Engagement” in a barely visible grey, visually demonstrating that 98.7% of the campaign’s footprint is noise, not signal.

The Whac-A-Mole Failure: Resilience of the Infrastructure

The containment strategy employed by Western governments and tech platforms against the Doppelganger network has collapsed into a game of “Whac-A-Mole,” characterized by a widespread failure to dismantle the underlying infrastructure. Even with high-profile enforcement actions, the operational tempo of the campaign has accelerated rather than diminished. A definitive example occurred in September 2024, when the U.S. Department of Justice seized 32 domains associated with the campaign. Within 24 hours, researchers at the DFRLab identified 12 newly registered domains, such as washingtonpost.pm and bild.ltd, that were fully operational and populating with fresh, Kremlin-aligned content. This rapid reconstitution confirms that domain seizures address only the symptoms, leaving the production pipeline intact.

The resilience of this network relies on a sophisticated “cloaking” architecture that separates the user-facing content from the backend. Investigations by Qurium and Check First have identified a traffic distribution system (TDS) known as “Kehr” as a central component. This service acts as a digital gatekeeper, analyzing incoming traffic to distinguish between real users and automated crawlers from security vendors or social media platforms. When a bot from Meta or X (formerly Twitter) attempts to inspect a link, Kehr serves a benign, innocuous page. When a targeted European or American user clicks the same link, they are redirected through a chain of intermediate domains to the disinformation content. This technique allows the campaign to bypass automated moderation filters with near-impunity.
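A defender probing for this kind of cloaking can request the same URL twice, once with a crawler-like client and once with a browser-like one, and compare the responses. The sketch below leaves the fetching abstract (the two HTML bodies are passed in) and uses an assumed similarity threshold; it is a heuristic illustration, not a production detector.

```python
# Sketch: defender-side cloaking check. A page that looks radically
# different to a crawler than to a browser is a cloaking candidate.
# The 0.9 similarity threshold is an assumed heuristic.
from difflib import SequenceMatcher

def looks_cloaked(body_as_crawler: str, body_as_user: str,
                  threshold: float = 0.9) -> bool:
    """Flag a URL when the page served to a platform crawler differs
    sharply from the page served to a browser-like client."""
    similarity = SequenceMatcher(None, body_as_crawler, body_as_user).ratio()
    return similarity < threshold

decoy = "Cooking tips: how to roast autumn vegetables at home."
payload = "BREAKING: leaked report claims sanctions are destroying the German economy."
print(looks_cloaked(decoy, payload))  # True: two very different pages
print(looks_cloaked(decoy, decoy))    # False: consistent content
```

In practice the comparison must also tolerate benign variation (rotating ads, timestamps), which is one reason automated platform scanning lags behind the operators.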

Doppelganger Infrastructure Recovery Timeline (Sept 2024)
Event Date/Time Action Taken Outcome
DOJ Seizure Sept 4, 2024 Seizure of 32 domains (e.g., washingtonpost.pm) Sites taken offline globally.
Network Response Sept 5, 2024 Registration of new TLDs (.co, .cc, .pw) 12+ new mirrors active within 24 hours.
Content Migration Sept 5, 2024 Restoration of archive content Articles from 2022–2024 migrated to new URLs.
Traffic Redirection Sept 6, 2024 Update of “Kehr” redirect rules Social media botnets point to new domains.
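Because the September 2024 reconstitution reused the same registrable names under fresh TLDs, defenders can pre-compute a watchlist of likely mirrors the moment a domain is seized. The TLD list below reflects those mentioned in this section; the function itself is an illustrative sketch, not a tool any named agency is known to use.

```python
# Sketch: pre-compute likely mirror domains for a seized Doppelganger asset.
# The TLD list mirrors those cited in the text (.co, .cc, .pw, etc.);
# everything else is an illustrative assumption.

CHURN_TLDS = ["co", "cc", "pw", "ltd", "pm", "ws", "online"]

def mirror_candidates(domain: str) -> list[str]:
    """List same-name variants of a domain under churn-prone TLDs."""
    name, _, current_tld = domain.rpartition(".")
    return [f"{name}.{tld}" for tld in CHURN_TLDS if tld != current_tld]

watchlist = mirror_candidates("washingtonpost.pm")
print(watchlist[:3])  # ['washingtonpost.co', 'washingtonpost.cc', 'washingtonpost.pw']
```

Registrars and platforms could monitor or pre-emptively block such candidates, shrinking the 24-hour reconstitution window the operators currently enjoy.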

Financial data reveals that the operators, primarily the Social Design Agency (SDA) and Structura National Technologies, have successfully integrated into the legitimate advertising economy of the platforms they attack. Between August 2023 and November 2024, the SDA spent approximately $338,000 on Meta advertisements to promote its content, despite being under strict EU and US sanctions. These funds purchased at least 8,000 separate sponsored posts targeting users in France, Germany, and Poland. The ability of a sanctioned Russian entity to transact hundreds of thousands of dollars on American platforms highlights a catastrophic gap in Know Your Customer (KYC) enforcement and ad verification.

The velocity of content generation further overwhelms manual debunking efforts. In the 12-month period ending May 2024, the network published over 12,970 articles in German alone, averaging one new piece of disinformation roughly every 40 minutes. This volume is sustained by “Aeza,” a bulletproof hosting provider that shields the campaign’s backend servers from takedown requests. By leveraging European hosting resellers and legitimate content delivery networks (CDNs) like Cloudflare, the operators mask their origin IPs, making technical attribution a slow and labor-intensive process for defenders. The infrastructure is not merely surviving; it is evolving to treat domain bans as a negligible operating cost.

Future Projections: Deepfake Integration and Voice Synthesis

The trajectory of the Doppelganger campaign has shifted from static mimicry to kinetic fabrication. While the operation’s initial phase relied on cloning the visual identity of Western media outlets, copying CSS frameworks, fonts, and bylines, the 2024-2025 operational pattern introduced a more volatile element: the systematic integration of AI-generated audio and video. This evolution transforms typosquatted domains from passive repositories of fake text into active broadcast hubs for synthetic reality. The infrastructure built to host washingtonpost.pm or bild.ltd serves as the delivery method for “deepfake” content that is far harder to debunk than written disinformation.

Intelligence reports from late 2024 indicate that the operators behind Doppelganger, primarily the Social Design Agency (SDA) and Structura National Technologies, began embedding AI-generated video clips directly into their cloned articles. These clips frequently feature synthetic news anchors or impersonated officials, designed to bypass the skepticism readers might apply to text alone. By 2025, the distinction between a “fake news site” and a “fake broadcast” had evaporated. The operation no longer needs to hire actors or build sets; commercial off-the-shelf AI tools generate “on-the-ground” reporting from non-existent war zones or fabricate press conferences with Western leaders.

The integration of generative AI into the Doppelganger framework allows for the automated production of ‘evidence.’ A cloned article claiming a corruption scandal is no longer just text; it is accompanied by a fabricated audio recording of the target, generated in seconds for pennies.

The most dangerous application of this technology has been the weaponization of voice synthesis. Unlike video deepfakes, which frequently suffer from visual artifacts like poor lip-syncing or unnatural blinking, AI audio is nearing indistinguishability from reality. In the lead-up to the 2024 U.S. and European elections, Doppelganger-linked domains circulated audio clips purporting to be “leaked wiretaps” of military generals and political candidates. These clips were embedded in articles mimicking Der Spiegel or Le Monde, lending the forgery the borrowed credibility of the host brand. A September 2024 report by the Microsoft Threat Analysis Center (MTAC) confirmed that Russian operatives were shifting focus to AI-enhanced audio because it offers higher engagement with lower detection rates than video.

The operational logic follows a cost-benefit analysis. Creating a “cheapfake,” a crude manipulation, risks immediate exposure, while high-quality deepfakes previously required significant compute power and technical expertise. The democratization of generative AI models in 2023 and 2024 collapsed this barrier. Doppelganger operatives can feed a Kremlin-approved script into a text-to-video model and output a clip of a trusted news anchor delivering the lines in fluent German, French, or English. This capability was highlighted in August 2025, when security researchers exposed the “Storm-1679” network, which shares tactical overlap with Doppelganger, impersonating BBC and CNN anchors to spread false narratives about the Ukraine war.

Comparative Analysis: Static vs. Kinetic Doppelganger Tactics

The following table outlines the operational shift observed between the campaign’s inception and its current state in 2026.

Operational Metric Phase I: Static Cloning (2022-2023) Phase II: Kinetic Synthesis (2024-2025)
Primary Content Text articles, stock photos, static cartoons. AI-generated video reports, synthetic audio “leaks.”
Production Time Hours (manual translation/coding). Minutes (automated script-to-video pipelines).
Credibility Anchor Visual design (logos, fonts). Sensory evidence (voice, video motion).
Detection Difficulty Low (URL inspection, bad grammar). High (Audio forensics required).
User Engagement Read-through rates frequently low. Video completion rates significantly higher.

The “Wolf News” incident, identified by Graphika in early 2023, served as a prototype for this integration. While those early avatars were stiff and visibly artificial, the 2025 iterations deployed on Doppelganger domains exhibit micro-expressions, natural breathing patterns, and perfect accent matching. This technical leap allows the campaign to target specific demographics with hyper-localized content. A cloned version of a Bavarian local newspaper, for instance, can feature a video report delivered in a regionally accurate dialect, discussing fabricated crimes by migrants to incite local unrest.

Moreover, the automation of this pipeline suggests a move toward “personalized” disinformation. Instead of a single fake article broadcast to millions, the infrastructure can theoretically generate unique video variants tailored to different user segments. If a user clicks a link from a Telegram channel focused on economic anxiety, the cloned site presents a video emphasizing inflation. If the traffic comes from an anti-war group, the same URL serves a deepfake about military escalation. This adaptive content generation represents the final frontier of the Doppelganger evolution: a malleable reality that adjusts itself to the fears of the individual viewer.

The 2025 RUSI report on Russian unconventional weapons noted that “Doppelganger’s adoption of AI is not an upgrade in quality but a change in doctrine.” The goal is no longer to convince the audience that a specific lie is true, but to saturate the information environment with enough synthetic noise that the audience abandons the search for truth entirely. When a reputable news site and its clone both feature video evidence, and the average user cannot distinguish the real anchor from the AI synthesis, the credibility of the legitimate press is neutralized.

Final Verdict: The Permanent State of Digital Siege

The trajectory of the Doppelganger campaign between 2022 and early 2026 demonstrates a structural failure in Western digital defense strategy. While the United States Department of Justice successfully seized 32 domains linked to the Social Design Agency (SDA) and Structura National Technologies in September 2024, the operational impact was negligible. Forensic analysis by the Digital Forensic Research Lab (DFRLab) confirmed that within 24 hours of the seizure, Russian operators had registered and populated 12 new domains to replace the lost infrastructure. This rapid recidivism indicates that the Doppelganger protocol has evolved from a finite campaign into a persistent, automated feature of the internet.

The resilience of this network relies on a fundamental economic asymmetry. The draft Russian federal budget for 2025 allocated approximately 137.2 billion rubles ($1.42 billion) for state information initiatives. This funding allows operators to treat domains, servers, and bot accounts as disposable munitions. Conversely, the cost for platforms like Meta and regulatory bodies like the European Commission to identify, adjudicate, and remove these assets remains disproportionately high. The defense operates on a legal timeline. The offense operates on a programmatic one.

Table 26.1: The Asymmetry of Information Warfare (2025–2026 Estimates)
Metric Russian Offensive Operations (Doppelganger) Western Defensive Response (Platforms/Gov)
Unit Cost $10–$15 (Domain Registration) $5,000+ (Investigation & Legal Process)
Time to Deploy 15 Minutes (Automated Scripting) 48 Hours to 6 Months (Detection to Takedown)
Content Generation Instant (Generative AI/LLMs) Manual Verification Required
Regulatory Burden Zero (Ignores all laws) High (Must comply with DSA / First Amendment)

Technological developments in 2025 further entrenched this asymmetry. The integration of generative AI into the SDA’s workflow removed the primary bottleneck of human labor. Earlier iterations of Doppelganger required human copywriters to draft fake articles for sites like Bild or The Washington Post. By late 2025, large language models generated these forgeries instantly in native German, French, and Polish. Meta’s Q2 2024 Adversarial Threat Report noted the increasing use of Generative Adversarial Networks (GANs) to create unique, undetectable profile photos for bot accounts. This automation allows the network to flood information spaces with a volume of content that manual moderation teams cannot match.

“The Justice Department is seizing 32 internet domains… yet the Russian operation remains active; newly created and restored websites swiftly emerged to replace those taken down.” (DFRLab Assessment, October 2024)

The targeting scope also expanded significantly in the 18 months leading up to March 2026. While the initial focus remained on eroding support for Ukraine, the apparatus pivoted to exploit domestic fissures within NATO member states. In April 2025, the network launched a coordinated assault on the Polish presidential elections using 279 distinct X (formerly Twitter) accounts to amplify anti-EU narratives. Similar infrastructure targeted the German federal elections. The goal shifted from convincing users of a specific lie to exhausting their capacity to distinguish truth from fiction. This strategy aligns with the “firehose of falsehood” model where the volume of noise is the weapon itself.

Regulatory frameworks such as the EU’s Digital Services Act (DSA) have proven insufficient against this specific threat vector. Doppelganger operators use “cloaking” techniques and three-stage redirect chains to hide the final destination of a link from platform crawlers. A user clicks a benign-looking link on Facebook. They pass through a middleware server. They land on a fake Le Monde article hosted on a server in a non-compliant jurisdiction. By the time regulators trace the chain, the disposable domain has already been discarded. The November 2025 proposal by the European Commission to establish a “Centre for Democratic Resilience” acknowledges this gap. Yet it remains a bureaucratic response to a software problem.

The evidence confirms that Doppelganger is no longer a discrete operation that can be “defeated” through sanctions or domain seizures. It has become a permanent feature of the digital environment. The infrastructure is decentralized. The funding is state-guaranteed. The content is AI-generated. Western democracies face a permanent state of digital siege where the integrity of the information ecosystem requires constant, automated maintenance rather than sporadic intervention.

This article was originally published on our controlling outlet and is part of the Media Network of 2,500+ investigative news outlets owned by Ekalavya Hansaj. It is shared here as part of our content syndication agreement.


About The Author
Ekalavya Hansaj


Part of the global news network of investigative outlets owned by global media baron Ekalavya Hansaj.

Ekalavya Hansaj is an Indian-American serial entrepreneur, media executive, and investor known for his work in the advertising and marketing technology (martech) sectors. He is the founder and CEO of Quarterly Global, Inc. and Ekalavya Hansaj, Inc. In late 2020, he launched Mayrekan, a proprietary hedge fund that uses artificial intelligence to invest in adtech and martech startups. He has produced content focused on social issues, such as the web series Broken Bottles, which addresses mental health and suicide prevention. As of early 2026, Hansaj has expanded his influence into the political and social spheres:

Politics: Reports indicate he ran for an assembly constituency in 2025.
Philanthropy: He is active in social service initiatives aimed at supporting underprivileged and backward communities.
Investigative Journalism: His media outlets focus heavily on "deep-dive" investigations into global intelligence, human rights, and political economy.