Alethea’s AI risk management platform, Artemis, detects social media manipulation at scale and tracks coordinated influence campaigns as they unfold. This visibility is particularly valuable for retrospective investigations, enabling researchers to trace narrative spread across multiple social platforms and pinpoint the earliest appearance of the targeted content.
In a recent investigation, Artemis surfaced the full lifecycle of a fabricated story published and pushed by the Russian influence operation known as Storm-1516. By detecting the campaign’s earliest signals—such as the registration of a spoofed domain or the first post amplifying a false story—Artemis revealed how it was constructed and which accounts were driving it. Equally important, our platform also traced patterns of rapid pushback from credible voices, showing how timely intervention helped blunt the campaign’s overall spread and impact.
Understanding the mechanics behind false narratives and narrative manipulation is essential for maintaining information integrity and public trust. AI risk management and digital threat intelligence platforms like Artemis not only expose how disinformation moves but also uncover the actors, networks, and tactics that drive it. This level of insight enables faster, more targeted interventions, strengthening societal resilience against future influence operations.
On October 27, 2025, posts appeared with an embedded video claiming that Ukrainian President Volodymyr Zelenskyy had recently purchased the Pathfinder Ranches in Wyoming for $79 million through an offshore company called Davegra Ltd. The video went on to describe the 916,076-acre property as “so vast that it’s bigger than the entire state of Rhode Island.”
The video mimicked actual real estate marketing materials, pairing a fabricated narrative with doctored visuals, a tactic frequently used to create a veneer of legitimacy. Right away, it was flagged as false or manipulated content, with clear signals pointing to the work of the long-standing Russian influence operation Storm-1516. Artemis helped confirm those suspicions when its “Domains” panel surfaced a website, swanlandco[.]us, created to impersonate the legitimate realtor for the Pathfinder property. The fraudulent swanlandco[.]us domain was first registered on October 21, 2025, six days before the start of the influence campaign, signaling pre-planning consistent with prior Storm-1516 campaigns.
Storm‑1516 is a Kremlin-linked influence operation that has been active since at least late 2023, specializing in crafting and disseminating high-impact false narratives across the U.S. and Europe, often targeting elections and political figures such as President Zelenskyy. The group often utilizes forged “whistleblower” videos, staged actors, fake news websites, and AI-generated media, then launders the output through networks of social media accounts that are then picked up by pseudo-media outlets to create an appearance of independent reporting.
Their campaigns have frequently targeted President Zelenskyy to seed corruption narratives and weaken Western support for Ukraine. In fact, Storm-1516 has attempted to tie him to at least 13 prior multimillion-dollar real estate deals, all of which were proven false. The group has also broadened its focus to Western democratic politics, targeting high-profile figures like former U.S. Vice President Kamala Harris and Governor Tim Walz with accusations of corruption and viral content about alleged electoral fraud. Analysts, including the French agency Viginum and U.S.-based labs, assess that the group meets the criteria for foreign digital interference, given its scale, sophistication, and focus on undermining trust in institutions.
In this investigation, Artemis collected 3,538 relevant instances of content related to this narrative posted between October 27 and October 30 across 10 monitored social media platforms and 8 news sources, along with outbound links to YouTube, Bitchute, TikTok, Facebook, and Instagram. The breadth of the data revealed how quickly the narrative was engineered to appear widespread.
11:42 a.m. (CT) - The first instance of the video was posted to X on October 27 by the account @Its_the_Dr, an account previously linked to past Storm-1516 content.
1:20 p.m. - A second instance on X followed from the account @MrPotatoHeadUSA, and 11 more posts and/or reposts circulated before the narrative jumped platforms. Two large language models (LLMs) were queried on X about the validity of the claims. Both @Grok and @AskPerplexity questioned the claims with varying degrees of detail and certainty, and both hinted at similarities to previous malign influence operations.
3:03 p.m. - Three hours and 21 minutes after the original post, the content appeared on a second platform, Truth Social. From there, it propagated in rapid succession.
3:42 p.m. - The content then crossed over to a third platform, Gab, from the account @GolddiggerPatriot.
3:43 p.m. - The content began to gain more traction globally when a French social media influencer, Silvano Trotta (@silvano_trotta, 260.7K followers), posted the content, receiving at least 37.2K views, 1.1K likes, 920 reposts, 43 quotes, 121 comments, and 107 bookmarks.
5:33-5:38 p.m. - SGAnon, a well-known propagator of past Russian influence operation content, posted the content to Telegram at 5:33, then to X at 5:36, and then to Truth Social at 5:38.
5:50 → 5:57 → 6:06 p.m. - The narrative then crossed over to the platforms Reddit, Minds, and Patriots.Win (respectively).
3:04 a.m. [October 28] - The post showed up on Gettr for the first time.
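The propagation timeline above lends itself to a simple velocity calculation. The sketch below is illustrative only: it hardcodes the first-sighting times reported in this investigation (all US Central Time) into a hypothetical `sightings` list and computes each platform's offset from the original X post, the kind of time-to-spread metric a timeline view like Artemis' makes possible.

```python
from datetime import datetime

# Hypothetical data structure: first sighting of the narrative per platform,
# using the times reported above (US Central Time, October 27-28, 2025).
sightings = [
    ("X",            datetime(2025, 10, 27, 11, 42)),
    ("Truth Social", datetime(2025, 10, 27, 15, 3)),
    ("Gab",          datetime(2025, 10, 27, 15, 42)),
    ("Telegram",     datetime(2025, 10, 27, 17, 33)),
    ("Reddit",       datetime(2025, 10, 27, 17, 50)),
    ("Minds",        datetime(2025, 10, 27, 17, 57)),
    ("Patriots.Win", datetime(2025, 10, 27, 18, 6)),
    ("Gettr",        datetime(2025, 10, 28, 3, 4)),
]

def spread_offsets(sightings):
    """Minutes elapsed from the earliest sighting to each platform's
    first appearance, in chronological order."""
    origin = min(ts for _, ts in sightings)
    return [(platform, int((ts - origin).total_seconds() // 60))
            for platform, ts in sorted(sightings, key=lambda s: s[1])]

for platform, minutes in spread_offsets(sightings):
    print(f"{platform}: +{minutes} min")
```

Running this reproduces the gap noted above: Truth Social appears 201 minutes (3 hours and 21 minutes) after the original post, with most platform crossovers compressed into a single afternoon.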
This pattern is consistent with Storm-1516’s laundering approach: publish first on X, reamplify through known propagator accounts, then push the narrative into ideologically aligned communities across smaller platforms to create the illusion of breadth and legitimacy.
Because accounts like @its_the_dr, @TheQNewsPatriot, @SlavFreeSpirit, @SororInimicorum, and @daniel_gugger were tagged in Artemis as previous propagators of Storm-1516 content, the Alethea investigations team could quickly pivot to track their cross-platform activity and see when a particular actor propagated the content and how their tactics differed on different channels.
According to the timeline generated by Artemis, the first posts calling out the content as Russian disinformation and attributing the campaign to Storm-1516 appeared two hours later from the account Gnida Project (@gnidaproject, 634 followers) on X and Bluesky, and were retweeted by the Russian disinformation research account @antibot4navalny (14.1K followers). Like Alethea, both accounts have previously reported on Storm-1516 operations and produce substantial, high-quality research on foreign influence operations.
The first mainstream reporter to “push back” on the content was senior BBC journalist Shayan Sardarizadeh (@Shayan86, 181.0K followers), roughly 8 hours after its appearance. The attention from a credible reporter and the research community shifted the dynamic, short-circuiting the potential virality of the campaign. Because of the increasing scrutiny in the replies to the original post, @its_the_dr deleted it later that same day.
This timeline stands in stark contrast with previous Storm-1516 campaigns, which often accumulate millions of views and circulate for days before facing meaningful debunking. In this instance, rapid detection and credible pushback throttled the narrative before it could fully take hold.
Artemis’ domain analysis revealed two different domains within the data set:
The .us domain contained obvious red flags: anonymous registration, a fictitious proxy with a Rochester, NY address, and a ProtonMail contact. In contrast, the .com domain matched the established realtor for Pathfinder Ranches.
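One of the simplest signals in cases like this is how closely a suspect domain's name tracks a legitimate one under a different TLD. The sketch below is a minimal, hypothetical heuristic (not a description of Artemis' internals): it strips the TLD and compares the remaining labels with Python's standard-library `difflib`, so a verbatim name reuse like swanlandco[.]us against the realtor's real domain scores as a perfect match.

```python
from difflib import SequenceMatcher

def lookalike_score(candidate: str, legitimate: str) -> float:
    """Similarity of the registrable labels, ignoring the TLD.
    Accepts defanged domains written with [.] in place of dots."""
    def strip_tld(domain: str) -> str:
        return domain.replace("[.]", ".").lower().rsplit(".", 1)[0]
    return SequenceMatcher(None, strip_tld(candidate),
                           strip_tld(legitimate)).ratio()

# The spoofed .us domain reuses the realtor's name verbatim under a new TLD,
# so the label comparison yields a perfect 1.0 score.
score = lookalike_score("swanlandco[.]us", "swanlandco.com")
print(f"lookalike score: {score:.2f}")
```

A production system would combine a signal like this with registration metadata (the anonymous registrant and ProtonMail contact noted above), but even this crude string comparison is enough to surface verbatim TLD-swap impersonations for analyst review.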
Within 24 hours of the video circulating, Swan Land Company representatives issued multiple statements on both X and YouTube confirming that both the video and the website swanlandco[.]us were fictitious and that President Zelenskyy was not the buyer of the Pathfinder Ranches in Wyoming. A local Wyoming paper, the Cowboy State Daily, also posted an article on October 28 refuting the claims.
The Zelenskyy ranch hoax makes one thing clear: resilience in today’s information environment depends on seeing threats early and moving faster than the narratives built to destabilize you. Storm-1516’s fabricated video spread because it mimicked legitimate marketing materials, relied on pre-built infrastructure, and was laundered through known propagator accounts. It failed to go fully viral, however, because credible voices intervened within hours, cutting off its momentum before it could reshape public perception. Speed has become a key competitive advantage, and leaders who still plan around a 24-hour crisis cycle are already operating at a disadvantage.
This investigation also reveals why cross-platform visibility matters as much as speed. Influence actors don’t rely on a single channel; they launder their content through ecosystems of accounts across X, Truth Social, Telegram, Gab, and beyond. A single takedown or post-level correction no longer neutralizes a threat. Organizations need tools that can map how a narrative travels, identify who amplifies it, and understand which communities are likely to pick it up next.
And while the forged video was the primary driver of the narrative, the more strategic signal came earlier: the spoofed domain was created nearly a week before the first post.
Influence operations are built long before they hit social media, which means true preparedness requires watching for the infrastructure (domains, proxies, certificates) that telegraphs an attack’s intent. Combined with the accelerating realism of AI-generated media, this means manual verification simply cannot keep up.
For business leaders, the takeaway is not doom but direction: resilience is possible when organizations shift from reacting to viral content to anticipating how manipulation forms. With platforms like Artemis providing early warnings, cross-platform detection, infrastructure signals, and AI-powered analysis, organizations can protect their brands, executives, and stakeholders long before misinformation reaches the point of impact.