AI Risk Management Platform Uncovers Russian Influence Operation
In a recent investigation, Alethea’s AI risk management platform, Artemis, detected and decoded the lifecycle of a fabricated story published and amplified by the Russian influence operation known as Storm-1516. Starting from the earliest signals, a spoofed domain and the first post amplifying the false story, Artemis revealed how the campaign was constructed, identified the accounts driving it across multiple platforms, and traced the rapid pushback from credible voices that shows how timely intervention can blunt a campaign’s overall spread and impact.
For modern organizations seeking to maintain information integrity and public trust, this brief case study demonstrates why visibility into the mechanics of false narratives and narrative manipulation is essential. To meet that need, next-generation digital threat intelligence platforms must expose not only how disinformation moves but also the actors, networks, and tactics that drive it, enabling faster, more targeted interventions and strengthening resilience against future threats.
Detecting the First Drops of Inauthentic Activity
On October 27, 2025, posts appeared with an embedded video claiming that Ukrainian President Volodymyr Zelenskyy had recently purchased the Pathfinder Ranches in Wyoming for $79 million through an offshore company called Davegra Ltd. The video described the 916,076-acre property as “a property so vast that it’s bigger than the entire state of Rhode Island.”
The video mimicked real estate marketing materials, pairing a fabricated narrative with doctored visuals, a tactic frequently used to create a veneer of legitimacy. It was quickly flagged as false or manipulated content, with clear signals pointing to the work of the long-standing Russian influence operation Storm-1516. Artemis helped confirm those suspicions when its “Domains” panel surfaced a website, swanlandco[.]us, created to impersonate the legitimate realtor for the Pathfinder property. The fraudulent swanlandco[.]us domain was first registered on October 21, 2025, six days before the start of the influence campaign, signaling the forethought and pre-planning consistent with prior Storm-1516 campaigns.

Operation Overview: What Is Storm-1516?
Storm-1516 is a Kremlin-linked influence operation, active since at least late 2023, that specializes in crafting and disseminating high-impact false narratives across the U.S. and Europe, often targeting elections and political figures such as President Zelenskyy. The group uses forged “whistleblower” videos, staged actors, fake news websites, and AI-generated media, then launders the output through networks of social media accounts whose posts are picked up by pseudo-media outlets to create an appearance of independent reporting.
Their campaigns have frequently targeted President Zelenskyy to seed corruption narratives and weaken Western support for Ukraine. In fact, Storm-1516 has attempted to tie him to at least 13 prior multimillion-dollar real estate deals, all of which were proven false. The group has also broadened its focus to Western democratic politics, targeting high-profile figures like former U.S. Vice President Kamala Harris and Governor Tim Walz with accusations of corruption and viral content about alleged electoral fraud. Analysts, including the French agency Viginum and U.S.-based research labs, assess that the group’s activity meets the criteria for foreign digital interference, given its scale, sophistication, and focus on undermining trust in institutions.
Crosswind: Tracking Platform-to-Platform Acceleration of False Narratives
In this investigation, Artemis collected 3,538 relevant instances of content related to this narrative, posted between October 27 and October 30 across 10 monitored social media platforms and 8 news sources, with outbound links to YouTube, Bitchute, TikTok, Facebook, and Instagram. The breadth of the data revealed how quickly the narrative was engineered to appear widespread.
11:42 a.m. (CT) - The first instance of the video was posted on X on October 27 by the account @Its_the_Dr, an account previously linked to past Storm-1516 content.
https://x.com/its_the_dr/status/1982850360889786611
1:20 p.m. - A second instance on X followed from the account @MrPotatoHeadUSA, and 11 more posts and/or reposts circulated before the narrative jumped platforms. Two large language models (LLMs) were queried on X about the validity of the claims: both @Grok and @AskPerplexity questioned the claims with varying degrees of detail and certainty, and both noted similarities to previous malign influence operations.
https://x.com/MrPotatoHeadUSA/status/1982875066389147682
3:03 p.m. - Three hours and 21 minutes after the original post, the content appeared on a second platform, Truth Social. From there, it propagated in rapid succession.
https://truthsocial.com/@BlazingPatriots/posts/115447915577720981
3:42 p.m. - The content then crossed over to a third platform, Gab, from the account @GolddiggerPatriot.
https://gab.com/GolddiggerPatriot/posts/115448069028981166
3:43 p.m. - The content began to gain more traction globally when a French social media influencer, Silvano Trotta (@silvano_trotta, 260.7K followers), posted the content, receiving at least 37.2K views, 1.1K likes, 920 reposts, 43 quotes, 121 comments, and 107 bookmarks.

https://x.com/silvano_trotta/status/1982911137885868439
5:33-5:38 p.m. - SGAnon, a well-known propagator of past Russian influence operation content, posted the content to Telegram at 5:33, then to X at 5:36, and then to Truth Social at 5:38.

https://t.me/realqnewspatriot/8637

https://x.com/TheQNewsPatriot/status/1982939386997014936
https://truthsocial.com/@RealSGAnon/posts/115448525911445978
5:50 → 5:57 → 6:06 p.m. - The narrative then crossed over to the platforms Reddit, Minds, and Patriots.Win (respectively).
https://www.minds.com/newsfeed/1829680729148100608

https://patriots.win/c/GreatAwakening/p/1ARJr3rwbm/zelenskys-offshore-company-daveg/c
3:04 a.m. [October 28] - The post showed up on Gettr for the first time.

https://gettr.com/post/p3tj2p458f3
This pattern is consistent with Storm-1516’s laundering approach: publish first on Telegram or X, reamplify through known propagator accounts, then push the narrative into ideologically aligned communities across smaller platforms to create the illusion of breadth and legitimacy.
Because accounts like @its_the_dr, @TheQNewsPatriot, @SlavFreeSpirit, @SororInimicorum, and @daniel_gugger were tagged in Artemis as previous propagators of Storm-1516 content, the Alethea investigations team could quickly pivot to track their cross-platform activity, seeing when each actor propagated the content and how their tactics differed across channels.
The Winds Shift: Credible Voices Blunt Virality
According to the timeline generated by Artemis, the first posts calling out the content as Russian disinformation and attributing the campaign to Storm-1516 appeared two hours later from the account Gnida Project (@gnidaproject, 634 followers) on X and Bluesky and were retweeted by the Russian disinformation research account @antibot4navalny (14.1K followers). Like Alethea, both accounts have previously reported on Storm-1516 operations and produce substantial, high-quality research on foreign influence operations.

https://x.com/gnidaproject/status/1982881833424675199
The first mainstream reporter to “push back” on the content was senior BBC journalist Shayan Sardarizadeh (@Shayan86, 181.0K followers), roughly eight hours after its appearance. The attention from a credible reporter and the research community shifted the dynamic, short-circuiting the campaign’s potential virality. As scrutiny mounted in the replies to the original post, @its_the_dr deleted it later that same day.
This timeline stands in stark contrast with previous Storm-1516 campaigns, which often accumulate millions of views and circulate for days before facing meaningful debunking. In this instance, rapid detection and credible pushback throttled the narrative before it could fully take hold.
https://x.com/Shayan86/status/1982972534506033435
The Infrastructure Behind the Hoax
Artemis’ domain analysis revealed two different domains within the data set:
- swanlandco[.]us - fraudulent
- swanlandco[.]com - legitimate
The .us domain contained obvious red flags: anonymous registration, a fictitious proxy with a Rochester, NY address, and a ProtonMail contact. In contrast, the .com domain matched the established realtor for Pathfinder Ranches.
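The swanlandco[.]us spoof follows a classic lookalike pattern: reuse a brand’s second-level label under a different TLD. A minimal detection sketch (function names and the naive domain split are illustrative assumptions, not Artemis’ actual logic; a production pipeline would use the Public Suffix List to split registrable domains correctly):

```python
def split_domain(domain: str) -> tuple[str, str]:
    """Split a registrable domain into (second-level label, TLD).

    Naive split on the last dot; multi-part TLDs such as .co.uk
    require the Public Suffix List to handle correctly.
    """
    sld, _, tld = domain.lower().rpartition(".")
    return sld, tld

def flag_lookalikes(observed: list[str], legitimate: list[str]) -> list[str]:
    """Flag observed domains that reuse a known brand's second-level
    label under a different TLD, as swanlandco.us did to swanlandco.com."""
    known = {split_domain(d)[0]: d.lower() for d in legitimate}
    flagged = []
    for d in observed:
        sld, _ = split_domain(d)
        if sld in known and d.lower() != known[sld]:
            flagged.append(d)
    return flagged

print(flag_lookalikes(["swanlandco.us", "example.org"], ["swanlandco.com"]))
# prints ['swanlandco.us']
```

Pairing a hit like this with the registration date (October 21, six days before the first post) is what turns a lookalike domain into an early-warning signal.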

Share counts by domain
Within 24 hours of the video circulating, Swan Land Company representatives issued multiple statements on both X and YouTube confirming that both the video and the website swanlandco[.]us were fictitious and that President Zelenskyy was not the buyer of the Pathfinder Ranches in Wyoming. A local Wyoming paper, the Cowboy State Daily, also posted an article on October 28 refuting the claims.

https://x.com/SwanLandCompany/status/1983244997546963350
https://cowboystatedaily.com/2025/10/28/no-zelenskyy-isnt-buying-916-000-acre-pathfinder-ranches-but-someone-is/
Key Takeaways: Risk Resilience in the Age of AI Begins Before the Narrative Breaks
The Zelenskyy ranch hoax makes one thing clear: resilience in today’s information environment depends on seeing threats early and moving faster than the narratives built to destabilize you. Storm-1516’s fabricated video spread because:
- It was engineered to jump platforms
- It borrowed legitimacy from known propagator accounts
- It blended AI-generated artifacts with real-world details
It failed to go fully viral because credible voices intervened within hours, cutting off its momentum before it could reshape public perception. Speed has become a key competitive advantage, and leaders who still plan around a 24-hour crisis cycle are already operating at a disadvantage.
This investigation also reveals why cross-platform visibility matters as much as speed. Influence actors don’t rely on a single channel; they launder their content through ecosystems of accounts across X, Truth Social, Telegram, Gab, and beyond. A single takedown or post-level correction no longer neutralizes a threat. Organizations need tools that can map how a narrative travels, identify who amplifies it, and understand which communities are likely to pick it up next.
And while the forged video was the primary driver of the narrative, the more strategic signal came earlier: the spoofed domain was created nearly a week before the first post.
Influence operations are built long before they hit social media, which means true preparedness requires watching for the infrastructure (domains, proxies, certificates) that telegraphs an attack’s intent. Combined with the accelerating realism of AI-generated media, this means manual verification simply cannot keep up.
What This Means for Business Leaders
For business leaders, the takeaway is not doom but direction: resilience is possible when organizations shift from reacting to viral content to anticipating how manipulation forms. With platforms like Artemis providing early warnings, cross-platform detection, infrastructure signals, and AI-powered analysis, organizations can protect their brands, executives, and stakeholders long before misinformation reaches the point of impact.

