The need for pro-democracy technology: some lessons from the USA

This is the opening text from a talk given by Anna Dent and Peter Wells at the start of an event on pro-democracy public sector technology on 18 June 2025.
We have all seen headlines and reporting about how political leaders are using public sector technology in undemocratic and harmful ways: taking practices and tools which seem harmless and repurposing them for deeply harmful ends, or using completely ill-suited AI for complex tasks with serious societal impacts.
What seems neutral and useful today might have a totally different set of impacts as social norms, legislation, or leadership shift and change. Those of us who design and build public sector technology might be unwittingly creating conditions for undemocratic practices.
In June, we held an event to bring together technology, digital and data professionals to explore how to design and future-proof the UK's public sector technology so that it is pro-democracy.
We started with a short presentation that used the current US government’s DOGE organisation to explore the potential issues, and then split into breakout rooms to discuss the UK’s risks and potential mitigations.
The presentation covered a range of topics:
- How US DOGE is failing to meet its stated ambition of efficiency, but succeeding in both shrinking the state and harming the people who rely on it
- How some of its work was fundamentally flawed, particularly when compared to techniques that we know can successfully transform government and public services
- How some people in the UK are trying to create their own DOGEs
- And the need to retain democratic control and accountability over other public sector technology and organisations
The USA's DOGE has failed, or has it?

One of us was on a call recently with someone who was let go from USAID, the US government aid agency that was effectively shut down by DOGE. His view, backed up by a lot of evidence, is that DOGE failed entirely on its stated aims of savings and efficiency, and the only place it succeeded was in terrorising and demoralising the government workforce.
But perhaps these stated aims were not actually DOGE’s real ambitions. Rather than focusing on identifying waste and fraud, it has been about shrinking the state, sometimes for ideological reasons and sometimes seemingly randomly; demonstrating its ability to act with impunity; embedding unelected and unaccountable people inside government; and unlocking vast amounts of data for who knows what purposes.
The Supreme Court recently allowed DOGE continued access to the social security data of millions of Americans despite ongoing legal challenges. Normally a small handful of federal employees have access to this deeply personal information; now that DOGE has it, the implications are potentially life-changing.
The opposite of efficiency savings

DOGE’s work on social security so far epitomises the gap between the reality of its actions and its claimed efficiency goals.
False information about the levels of fraud and apparent payments to millions of dead Americans was used as justification for the large-scale takeover of social security leadership and access to data. When officials tried to correct the record the White House told them not to.
People worried by what DOGE is doing have been making welfare claims earlier than they would have otherwise, flooding phone lines and field offices with applications and requests for help.
Thousands of migrants have been added to a list of apparently dead social security claimants to ‘encourage them to self-deport’ and the head of social security agreed to share nearly 100,000 addresses of claimants with Immigration and Customs Enforcement.
A fundamentally flawed approach
It doesn’t feel like DOGE was really about efficiency or savings or modernisation in any real sense. Instead, they grabbed some headlines and shored up support from those who think the government is bloated and useless. Even a cursory glance at how they went about things shows how un-serious it was.
Take the example of the Department of Veterans Affairs. Donald Trump said all contracts should be reviewed within 30 days. The DOGE team used an off-the-shelf genAI model, gave it a shoddy prompt and told it to cancel anything that wasn’t directly related to patient care.
The tool recommended cancelling the department’s contract for internet access.
Just imagine if rather than an off-the-shelf genAI model that had been an off-the-shelf AI agent with the ability to rapidly act on the recommendations. Would the AI agent have cancelled the internet connection it needed to cancel more contracts?
How could someone even get access to write and run such shoddy code?
This DOGE model of fast, scrappy hacks with no interest in the things that might break along the way is in total opposition to what we know works in the UK, and in many other countries around the world, which have been using technology and modern ways of working to deliver better public services.
In those places we see diligent work by multidisciplinary teams focussed on improving services for people. Teams that take a broader, more systemic view of the desired outcomes than ‘efficiency’, and that have both the expertise and the time to understand problems properly before acting.
But while this kind of reform can be effective, it takes time and costs money.
Some people forget about DOGE’s other motives. They think only about the claims of ‘efficiency’ and reckon they can take a shortcut to reinvent government.
DOGE in the UK
So, the UK now has advocates for a similar approach to the US DOGE.
These advocates are of variable credibility, and many will not be able to do much more than make a noise, but sadly DOGE has entered the discourse as something worthy of serious debate.
So we need to take seriously the risk of a future political leader trying to use the US DOGE approach in the UK. Some have already tried.
UK political party Reform created its own version of a DOGE team and sent it into Kent county council to look for efficiencies. The team immediately gained headlines for rapidly learning what anyone who works in the sector could have told them: most local authorities run on old, outsourced IT that costs more money and delivers worse public services than more modern approaches that could be delivered by skilled internal teams.
Next, they will realise that local authority funding is 18% lower in real terms per person than it was at the start of the 2010s while their responsibilities have only increased. It’s incredibly hard to deliver services, let alone redesign them, under that kind of pressure.
What if someone created a DOGE in the UK’s national government?
Even if a UK version of DOGE didn’t act as recklessly as its US counterpart, a great deal of harm could be done under the guise of efficiency, cutting red tape and using technology to rapidly deliver things, regardless of whether those things are lawful or not.
In the UK we have already seen that many of our institutions run on trust and are vulnerable to people who don't care about the rules, especially the unwritten ones. Just ask anyone affected by the Post Office scandal who is still waiting for a relatively slow-moving justice system to catch up with a simultaneous misuse of both technology and the ability to bring private prosecutions.
What could a UK DOGE do if it controlled the various national government digital platforms? The English NHS app and Federated Data Platform? Or organisations like the Office for National Statistics?
Because while digital platforms and infrastructure can increase efficiency and make it much easier for more people to build great public services, they can also make it much easier for a small number of people to break democracy and break the law.
Harming both those services and the people who rely on them.
This is not just about DOGE
The Sycamore Collective is not just here off the back of DOGE. DOGE’s actions have been a trigger for mounting concern, but the idea of democratic tech, and the need for democratic values to be prioritised in the deployment of public sector technology, has not just emerged this year.
In some ways DOGE is not our biggest problem. It’s so visible and pantomime-ish that it can be monitored and its claims and ideas easily picked apart. Perhaps the UK’s DOGE teams will turn out to be damp squibs?
Just as troubling, perhaps more so because it is more under the radar, is the likes of Palantir becoming deeply embedded in the UK’s public services. Not just as a straightforward contractor, but shaping policy and how we function as a nation.
Where is the democratic accountability for that? Are we comfortable with a company that sponsored a military parade for President Trump’s birthday also being intrinsic to the NHS?
And that’s before we even think about the new AI tools which are currently being built around generative AI models created and hosted by large overseas firms.
What risks are being created by these and other tools? Who could end up controlling those new sources of power that could underpin so much of the state and the UK’s democracy?
And how do we make sure that we retain democratic control across the UK’s four nations and in local government?
Exploring questions like these is why we organised this event.