Why the UK Now Needs a National Disinformation Agency

Fake avatar: An actor dressed as Mark Zuckerberg arriving outside Portcullis House, London, where a hearing was taking place on the impact of disinformation on democracy, 27 November 2018. Image: PA Images / Alamy Stock


We live in an age where information is a battlefield and our adversaries are already fighting on it. To defend the UK's 'cognitive resilience', we must replicate the institutional foresight that led to the creation of the NCSC a decade ago.

In 2015, the UK made the bold decision to establish a dedicated public-facing National Cyber Security Centre (NCSC), bringing a fragmented response together under a single, empowered organisation. This was a radical but necessary step. Now the 2025 Strategic Defence Review has identified a new top-tier threat: disinformation, a core part of state-sponsored hybrid warfare that demands the same ‘whole-of-society’ institutional response – which, as yet, is absent.

The Threat Reality: Scale and Fragmentation

Russia and China view the cognitive domain as an integral battlefield, embedding information warfare into their national security strategies far more deeply than the West does. For these states, subversion is not a sideshow but the main event, as OpenAI's June 2025 threat intelligence report revealed.

State actors primarily misused AI platforms to support information operations rather than traditional cyber attacks. The report documents operations from China, Russia and Iran using AI to generate social media content and fake personas at unprecedented scale. One Chinese operation alone generated hundreds of coordinated comments, while Russian actors used AI to support German election interference through the ‘Portal Kombat’ network.

Modern disinformation campaigns succeed primarily through manipulation of authentic information – threat actors amplify real but carefully selected content to distort public perception and exploit algorithmic systems to create false impressions of public sentiment. These operations far exceed what traditional media regulation or intelligence agencies can address. Russia alone has reportedly invested over $1 billion in ongoing disinformation campaigns aimed at diminishing Western support for Ukraine.

Yet responsibility for addressing these campaigns remains fragmented across government departments, civil society and the private sector. The UK Cabinet Office's recent Chronic Risks Analysis identified information warfare as a systemic threat to national stability, but despite this recognition, the institutional response creates vulnerabilities that nation-state actors actively exploit.

The 2024 Southport attack and the summer riots that followed, amplified by foreign interference, demonstrated this fragmentation. False information sparked nationwide rioting within hours of the killing of three young girls, while key regulators could not take effective enforcement action. One department focuses on platform regulation, another handles public messaging, and intelligence agencies or the military track the specific threat actor – leaving no single entity with the mandate, resources or authority to coordinate a comprehensive response in real time.

The Response Gap: Speed and Structure

Information warfare operates at digital speed, with adversaries adapting tactics and exploiting events within hours. Traditional government response timeframes – measured in weeks or months – are fundamentally mismatched to the operational tempo of modern information operations. Even recent regulatory advances like the Online Safety Act, while welcome, rely on reactive content takedowns that cannot match the speed at which adversaries manipulate legitimate information or pivot their strategic objectives.

Foreign state actors weaponised coordinated TikTok networks to boost an obscure far-right candidate to first-round victory in Romania's 2024 presidential election, forcing authorities to annul the results – a first in EU history. Adversary campaigns achieve their goals before traditional policy processes can even identify threats, let alone respond.

The result is a ‘tragedy of the commons’ where no single entity is fully accountable for national cognitive security. The recently established cross-government Defending Democracy Taskforce and the standing unit within the Department for Science, Innovation and Technology (previously housed in the Department for Digital, Culture, Media and Sport) are a recognition that there is an issue to be addressed. But they lack the capabilities of a well-resourced, permanent national agency at the centre of an end-to-end ‘whole-of-society’ response.

From Secret State to Protective State: The NCSC Precedent

The NCSC's creation represented more than institutional reorganization – it embodied a fundamental shift in how the UK approaches nation-state threats. The innovation was to transform secret government expertise into operational knowledge shared with the public and private sectors, creating a far-reaching transition from 'secret state' to 'protective state'.

Previously, knowledge about state-sponsored cyber capabilities remained locked within classified intelligence channels. This transition proved revolutionary, forging deep partnerships across the entire cybersecurity ecosystem that enabled rapid threat detection, mitigation and a wider ‘whole of society’ approach. Yet, we have not made this same transition for disinformation.

Intelligence agencies develop deep expertise in information warfare threats and nation-state actors, while social media platforms detect manipulation techniques in real time – yet these critical capabilities remain trapped in separate institutional silos. Classified protocols compartmentalise government intelligence, and national security requirements compete with corporate governance structures, preventing the seamless coordination of actionable insights between rival technology platforms and national security responders.

A National Disinformation Agency could replicate the NCSC model by creating the institutional mechanism to safely declassify and operationalise intelligence about information warfare adversaries and strengthen coordination with Five Eyes and NATO partners. It could also provide the essential partnerships across the whole information ecosystem – from social media platforms to news organisations and fact-checking services – enabling a whole-of-society response informed by the deepest available understanding of how these threats operate.

The Path Forward: International Precedents

The UK's allies are starting to hard-wire cognitive security as an operational imperative through institutional change. Many recognise that information warfare often emanates from the same threat actors as other nation-state operations, leading them to expand existing cybersecurity mandates or to establish dedicated agencies.

Ukraine provides the most high-profile example, establishing a dedicated Center for Countering Disinformation in 2021 that operates as an international hub for collecting and analysing information to enhance cognitive resilience and help partners counter the technical infrastructure behind disinformation. This institutional response demonstrates the kind of centralised, dedicated capability that enables rapid identification, analysis and response to information threats – precisely what the UK's fragmented approach currently lacks.

France established VIGINUM in the same year and has successfully disrupted major Russian disinformation networks, including the aforementioned ‘Portal Kombat’ and AI-driven campaigns such as ‘Storm-1516’. In the United States, CISA has led efforts to counter foreign influence operations targeting critical infrastructure, recognising the convergence between cyber and information warfare threats, though these capabilities have faced recent political challenges.

The Imperative for Cognitive Defence

A national agency would provide the clarity of responsibility currently lacking, ensuring a coherent national strategy and preventing the ‘tragedy of the commons’ where no single entity is fully accountable. By integrating intelligence, technical capabilities, private sector partnerships and public communication under one roof, it could create a formidable, agile and strategically coherent force.

Establishing such an agency would not be easy: protecting civil liberties, resolving institutional ownership and measuring success all demand careful solutions. Robust safeguards would be needed to protect fundamental freedoms, and the agency's mandate would have to be strictly limited to countering state-sponsored hostile information operations, explicitly distinguishing these from legitimate public discourse, political debate or critical speech. Leveraging the precedent set by the National Security Act's foreign interference offence, the agency could operate with transparency, accountability and independent oversight to prevent government overreach or censorship.

Just as the NCSC revolutionised the UK's cyber security posture, a dedicated National Disinformation Agency is now imperative to protect the nation's cognitive resilience. The threats are actively shaping elections, undermining trust and influencing outcomes before conflicts reach any traditional battlefield. The time for a fragmented approach is over; the era of comprehensive cognitive defence must begin.

© RUSI, 2025.

The views expressed in this Commentary are the author's, and do not represent those of RUSI or any other institution.


WRITTEN BY

William Dixon

RUSI Associate Fellow, Cyber and Tech

View profile

