On Tuesday 4 November 2025, PICTFOR hosted a roundtable bringing together Parliamentarians, policymakers, and industry representatives to examine the relationship between cybersecurity and democracy. The discussion explored the growing risks posed by state-backed interference, misinformation, and social-media manipulation, and how these trends threaten electoral integrity and public confidence. Participants also considered the importance of transparency, regulation, and public education in building democratic resilience, as well as the practical steps needed to strengthen cyber defences across government, political parties, and civil society.
Baroness Neville-Rolfe, PICTFOR Treasurer, Shadow Minister (Treasury)
Baroness Neville-Rolfe opened the meeting by welcoming attendees and highlighting the growing convergence of digital, cyber, and geopolitical risks. She noted that ransomware, AI-generated misinformation, and emerging technologies such as quantum computing now sit at the top of both business and government agendas.
Introducing the panel, she welcomed Sharon Hodgson MP, Leader of the UK Delegation to the OSCE Parliamentary Assembly; Dr Ben Spencer MP, Shadow Minister for Science, Innovation and Technology; and a representative of the National Cyber Security Centre (NCSC).
Sharon Hodgson, Member of Parliament for Washington and Sunderland West, Leader of the UK Delegation to the OSCE Parliamentary Assembly
Sharon Hodgson MP outlined the OSCE’s work monitoring election integrity across Europe and reported on Baroness Winterton’s recent observation mission to Moldova. She described evidence of Russian interference through coordinated online propaganda, fake social-media accounts, and the weaponisation of energy supplies to destabilise pro-European institutions.
She warned that digital misinformation now represents a global democratic threat, with organised manipulation campaigns identified in 81 countries. She cited polling showing that 58% of US adults expect AI to worsen misinformation during future elections.
Hodgson stressed that democracy must be protected through cross-border resilience, better tracking of covert financing, and public education to help citizens identify manipulation. She confirmed that the UK Delegation will next observe the Kyrgyzstan elections and that OSCE monitoring of Russian-linked interference remains a priority.
Dr Ben Spencer, Member of Parliament for Runnymede and Weybridge, Shadow Minister for Science, Innovation and Technology
Dr Spencer argued that the most effective weapon hostile actors have deployed is to “make us stop believing ourselves.” He distinguished between direct cyberattacks (such as ransomware) and indirect manipulation (via social-media messaging).
He warned that the UK had been “lucky” not to experience fabricated election material at scale but that the risk is growing. Solutions, he said, include:
- Digital watermarking to verify provenance and authenticity of online content.
- Transparency of social-media algorithms so users can see why information is promoted and by whom.
- Greater clarity on who users are engaging with – humans or automated bots.
Turning to cyber resilience, Dr Spencer noted that legislation such as the Cyber Security and Resilience Bill must contain precise definitions to drive cultural change rather than box-ticking. He also warned that the UK cloud market is dangerously concentrated among AWS, Microsoft, and Google, creating systemic risk when outages occur. He called for a functioning cloud marketplace that enables informed, secure choices and reduces reliance on a handful of large providers.
NCSC Representative
The NCSC representative outlined the NCSC’s role in supporting democracy through four lenses: infrastructure, institutions, individuals, and information.
- Infrastructure: The NCSC works with local councils and election suppliers as registration systems migrate to cloud platforms. The 2021–22 Electoral Commission cyberattack, attributed to a Chinese state actor, underlined the threat.
- Institutions: Political parties and internal leadership contests are prime targets. The NCSC provides early-warning services to alert them to suspicious activity.
- Individuals: Candidates face growing risks from state-backed and commercial intrusion markets. The UK is leading the Pall Mall Process to curb the trade in commercial hacking tools.
- Information: The NCSC is not an arbiter of truth but works with industry and civil-society organisations on technical authenticity tools such as watermarking.
They confirmed that the NCSC will pilot enhanced security services for the Welsh, Scottish, and regional elections in 2026, ahead of the next UK general election in 2029.
The discussion then opened to the floor.
Damian Hinds MP asked about the scale of interference in UK elections. The NCSC representative replied that the threat is increasing, particularly through divisive misinformation and disinformation. AI amplifies both the problem and the potential for detection, they said, adding that the UK’s paper-based voting system remains an important safeguard.
Baroness Berger warned of the growing influence of paid online influencers spreading toxic or extreme content below the criminal threshold. Algorithms, she said, reward such material, creating a daily “drip-feed” that corrodes democratic trust.
Sir John Whittingdale MP noted that the Foreign Affairs Committee (of which he is a member) has documented Russian use of TikTok and Telegram to spread disinformation. He emphasised the need to focus less on removing content and more on preventing initial amplification through stronger platform controls.
The NCSC representative responded that while the NCSC can advise technically, policy interventions on amplification and influencer activity sit with the Department for Science, Innovation and Technology.
Sharon Hodgson MP emphasised the importance of public education and digital literacy, stressing that campaigns should help people question and critically assess online content rather than believe everything they see. She suggested that policymakers should be asking these kinds of questions when deciding how such educational initiatives are designed and delivered.
Dr Ben Spencer MP welcomed the focus on influencers and reiterated that while algorithms themselves are not inherently harmful, there is a need for much greater transparency around how they operate. He cautioned that companies with market dominance can alter algorithms without notice, particularly in unregulated contexts, which undermines accountability.
Dr Hugo Leal, Research Associate at the Minderoo Centre for Technology and Democracy, highlighted the importance of addressing scale and context rather than isolated incidents. Effective countermeasures, he argued, must act on network dynamics: moderating context (the patterns by which posts and information are shared widely) rather than content (individual similar posts with minor reach), in order to preserve free speech while protecting information integrity. He warned that major technology firms are curtailing researchers’ access to data, impeding evidence-based policymaking. Leal explained that disinformation spreads through complex contagion, in which repeated exposure drives belief; identifying and disrupting amplification networks is therefore more effective than censoring individual posts.
Stephen West, Secure Government & Critical Infrastructure Director at Zaizi, noted that government adoption of emerging technology, especially in deepfake detection, has improved but remains slow and fragmented. SMEs face major procurement barriers, limiting innovation and agility.
Professor James Davenport, Vice-Chair of the Council of Professors and Heads of Computing (CPHC), cautioned that watermarking only works if platforms preserve and display provenance marks. Some companies strip these identifiers, so regulation may be needed to mandate their retention.
Jonathan Stevenson-Paul, Estates Manager, Arqiva, advocated blending new and traditional educational methods to improve digital literacy. Platforms should integrate prompts encouraging users to verify sources before sharing.
Sir John Whittingdale MP concluded that the most reliable defence against disinformation is independent, professional journalism. Citizens must know which outlets can be trusted. He warned that journalism is under pressure from AI summaries that divert traffic away from news sites, undermining the sector that ensures factual accuracy.
Baroness Neville-Rolfe closed the discussion by noting the breadth of the challenges identified – from cyber resilience and watermarking to influencers, algorithms and media trust. She stressed the need for co-ordination across government departments and clearer ownership of responsibilities to prevent fragmentation.
This is not an official publication of the House of Commons or the House of Lords. It has not been approved by either House or its committees. All-Party Parliamentary Groups are informal groups of Members of both Houses with a common interest in particular issues. The views expressed in this report are those of the group.