Can AI help Russia Decisively Improve its Information War Against the West?
The AI arms race is undoubtedly 'gathering speed'. What about the information wars? 'Countries that are very good at AI could gain significant leverage in international relations through influence operations', notes Military Review in its March-April 2025 issue. It is therefore necessary to consider the threats that Russia has presented, and could present in the future, with AI-powered information operations.
After the 2008 war with Georgia, Russian authorities concluded that they were lagging behind their Western rivals in information-war capabilities, especially propaganda. There is genuine concern in Russia about the potential for regime destabilisation implicit in this inferiority; the Russians consider information war a core dimension of national security. Shortly after the Georgian war started, so did a long-term effort to catch up, which faced multiple obstacles. One obstacle was the relative lack of financial resources, which made it unfeasible to take on the massive media empires of the West head-on. However, the rapid spread of social media has offered the Russians a low-cost vehicle for becoming more competitive.
Another obstacle was the rigidity and bureaucratic culture of the state apparatus, including the Russian intelligence agencies. Attracting the new skills deemed necessary for relaunching Russian information-war capabilities appeared problematic, at least in the short term. Hence, the agencies began tapping into the commercial sector, where information-manipulation techniques, typically imported from the West, were already in widespread use in the world of corporate rivalry and advertising. Among the actors involved in this effort was Yevgeny Prigozhin's Internet Research Agency, before the Wagner Group even emerged.
There is no doubt that Russia's capabilities in the field have increased massively since. The Wagner Group's information operations in Africa have been widely acknowledged as effective in undermining France's influence and expanding Russia's. The quality of Russian information operations in Ukraine is also claimed to be improving. Allegations that Russian information operations decisively influenced the 2016 elections in the US or the 2024 presidential elections in Romania are more questionable.
The Role of Western Media
Whatever the specific tactical impact of the various information operations launched by the Russians in recent years, and despite the unquestionably considerable investment and intensification of effort, the Russians still seem to believe they are behind, and countless volumes and articles repeat how great a threat this is. They know that social media-focused efforts have limited impact unless they integrate popular influencers and, in turn, cooperate with more traditional media. While it helps that Western media unwittingly amplify the impact of Russian information operations by portraying them as hugely impactful, this only adds to the perception of Russia as a major actor in the field and does little to help Russia achieve specific objectives. Indeed, the Wagner Group's success in Africa was largely due to its ability to integrate its own information operations with local media and influencers.
As generative AI becomes more accessible and more powerful, it lowers the barrier to entry for a wider ecosystem of pro-Russian actors
In many theatres, including European ones, there is limited or no potential for Russia to cooperate with major media organisations. Relatively popular influencers can be easier to co-opt or enlist in Russian operations, but the fact remains that nothing comparable to the Africa operations can be replicated in Europe or North America. Here, the scope of Russian information operations must necessarily be more limited. In practice, they tend to focus on discrediting specific institutions deemed hostile to Russia, such as national governments and the EU; on sowing distrust between populations and their institutions; and on promoting narratives of unreliability among states and other actors.
In this context, it is worth asking whether the Russian government's recent enthusiasm for AI reflects a belief that it could contribute significantly to its information-war efforts. As generative AI becomes more accessible and more powerful, it lowers the barrier to entry for a wider ecosystem of pro-Russian actors – not just state agencies but also hacktivists and online influencers – to experiment with and gradually operationalise these tools in increasingly sophisticated ways. These experiments are low cost and carry little consequence for the practitioners.
Disinformation actors affiliated with the Russian state are already thought to have invested heavily in AI technologies to influence European audiences in the run-up to the 2024 European Parliament elections. Generative AI is already being integrated into Russian information warfare in a number of ways. One prominent example is the automated generation of synthetic content, including fake articles, social media posts, manipulated images, and deepfake audio or video. The focus is often on mimicking legitimate Western news outlets, with the obvious intent of undermining public trust and muddying the information environment. Another strategy, which relies on AI-enhanced bots and automated social media accounts, aims to flood online discourse with misleading content and create the appearance of grassroots support for particular viewpoints, threatening to poison public debates.
Misleading Large Language Models
Russian actors have also started to experiment with techniques such as 'LLM grooming', which consists of injecting into the web, on a large scale (millions of items), propaganda or biased material designed to skew the outputs of Large Language Models (LLMs). The goal is to indirectly shape AI-generated narratives by contaminating the content the models are trained on. The underlying idea is to use AI as a force multiplier, automating and scaling influence operations.
However, online chats among Russian AI enthusiasts also reveal a clear anxiety over the West's comparative proficiency in the AI field, as experienced daily on the Ukrainian cyber frontline, where Ukrainian and Western intelligence agencies reportedly rely increasingly on AI to improve the quality of their propaganda output. Russian commentators cite examples such as supposedly AI-generated fake intercepted communications between Russian soldiers and their relatives, deepfake videos falsely attributed to Russian officials, and manipulative social media campaigns, arguing that these represent coordinated psychological operations designed to weaken Russian resolve and morale.
Despite significant enthusiasm surrounding the strategic integration of AI in information warfare, there is also substantial criticism and frustration regarding the limitations of Russia's domestic AI platforms, primarily Sberbank's GigaChat and YandexGPT. Users have complained on social media that these Russian services are less responsive and provide inferior responses compared to Western tools. Because the underlying AI architecture relies on foreign-developed technologies such as Midjourney and GPT, Russian AI tools tend to be politically unreliable as well, for example refusing to recognise Russia's annexation of Crimea and Donbass. These shortcomings, aside from providing material for cheap jokes, undermine the Kremlin's narrative of taking Russia to the forefront of AI development and capturing a 'significant share' of the global AI market by 2030.
As noted by a Russian expert, at present Russia is too far behind the US and China to catch up. The ongoing investment in Russian AI capabilities may yet pay dividends, but neither the US, nor China, nor the Western Europeans are going to be idle. For Russia, improving on yesterday's capabilities is not enough; it must improve faster than its geopolitical rivals, which is a much taller order.
© RUSI, 2025.
The views expressed in this Commentary are the author's, and do not represent those of RUSI or any other institution.
WRITTEN BY
Dr Antonio Giustozzi
Senior Research Fellow
Terrorism and Conflict