The Future Surveillance Dystopia

“The eyes of the state can be on you at all times”

The United Kingdom is currently overseeing the testing and gradual deployment of a suite of technologies designed to monitor, analyse, and predict the behaviour of its citizens in ways that were previously confined to dystopian fiction. The scale of these systems, and more importantly the philosophy underpinning their use, should concern anyone regardless of political persuasion or criminal intent.

These technologies will not remain confined to the UK. There is a long and well-established precedent of individual states operating as de facto pilot studies, trialling new forms of surveillance and control before they are adopted elsewhere. Controversial technologies trialled in one country today often become conventional practice globally tomorrow.

The UK itself provides clear historical examples. In the 1990s, Britain became the most surveilled democracy on earth, with the highest density of CCTV cameras anywhere in the world, setting a template that other Western nations soon followed. It remains one of the most heavily surveilled countries today, though it has since been surpassed by both China and the United States. The same pattern can be seen with Automatic Number Plate Recognition, which uses optical character recognition to read vehicle registration plates: first deployed extensively in the UK, it is now embedded across policing, border control, and private security throughout the Western world. The lesson is that surveillance technologies deemed successful rarely remain within the borders of their country of origin.

The most concerning recent development in UK surveillance is the use of AI to predict crime before it happens. According to reporting in The Telegraph, police chiefs in the UK are currently evaluating around 100 separate AI projects, with the Government investing £4 million in the creation of an interactive, AI-powered map of England and Wales intended to be fully operational by 2030. Its stated purpose is to identify areas likely to experience criminal activity and to recommend police intervention before any crime actually takes place. Sir Andy Marsh, head of the College of Policing, has described plans to identify the 1,000 most “dangerous predatory men” believed to pose the highest risk to women and girls. These individuals would be flagged for crimes they are statistically likely to commit based on data and case histories. Marsh has stated openly that the aim is to make such men “frightened because the police are coming after them”.

There are several problems with this approach. The first is epistemic. Men willing to engage in highly taboo, predatory behaviour have already demonstrated a willingness to defy social norms and legal constraints; they are, by definition, less predictable than the average citizen. Treating them as stable data points risks giving police a false sense of control over individuals who do not conform to statistical regularities. Equally, if such men know they are being observed, it would make sense for them to alter their behavioural patterns and so render the AI’s predictions far less effective. Because they are being monitored by artificial intelligence rather than by human officers, they could continue to commit crimes simply by departing from the prior patterns of criminality the AI relies on to predict their behaviour in the first place.

The second problem is that monitoring these 1,000 predatory men is less effective than the alternatives. Predictive systems may create an illusion of containment while diverting attention from the reality that the most reliable way to prevent certain crimes is the physical removal of genuinely dangerous individuals from society, whether through imprisonment or, in the most serious cases, capital punishment. The return of the latter is still supported by a majority of Britons. Given the current shortage of prison space in Britain, predictive monitoring will incentivise future governments to release dangerous criminals back into society, free to commit further crimes, while maintaining the illusion that preventative measures are in place.

This approach also further erodes the boundary between what is a crime and what is not. How do you deal with someone who is predicted to commit a crime but has yet to do so? Do you treat them as a criminal? Until very recently, the police recorded “non-crime hate incidents”, which could prompt police intervention despite, by definition, no crime having been committed. Being “guilty” of such an incident could mean officers visiting one’s home or even a trip to the police station, and these incidents appeared on enhanced background checks despite the absence of any criminal conviction. Non-crime hate incidents were scrapped only after a considerable campaign against them. Nevertheless, their history demonstrates that individuals can be effectively criminalised for behaviour deemed problematic by the state even when no actual crime has occurred.

While efforts to protect women and girls are undeniably admirable, it would be naïve to assume these tools will remain narrowly confined to this domain if they are deemed successful. There are powerful institutional incentives to expand their application. Once a system exists to identify high-risk individuals, the definition of risk inevitably broadens. Political dissidents, protest movements, journalists, and activists all generate behavioural patterns that can be framed as disruptive. Within the increasingly dominant technocratic, data-driven governance model, it seems inevitable that such technology would eventually be employed in this manner.

If this were not concerning enough, it comes alongside plans to implement live facial recognition technology in every town centre across the country. Such systems allow police to identify individuals in real time, track their movements, and retrospectively reconstruct their behaviour. When combined with predictive analytics, this creates the technical foundations for continuous population monitoring regardless of criminality.

The political architect of much of this vision is the Home Secretary, Shabana Mahmood, who has described her ideal system as a “panopticon” in which “the eyes of the state can be on you at all times”. She recently articulated this vision to Tony Blair, a figure synonymous with the expansion of technocratic governance and surveillance powers, and received his stamp of approval.

The panopticon is a prison design that originated with the 18th-century philosopher Jeremy Bentham. It consists of a rotunda with cells arranged along the outer circumference across multiple levels. The side of each cell facing the centre of the circle has iron bars, while the outer side has a window, allowing light to pass through the cell and silhouette the prisoner, making them easily visible. At the centre is a watchtower from which a single guard can observe all prisoners without being seen himself. Although one guard cannot physically monitor every prisoner at once, prisoners can never know when they are being watched and are therefore compelled to behave as though they always are. It is effectively a prison designed to induce self-regulation through uncertainty.

Applied at the level of an entire population, the implications are deeply sinister. A panoptic surveillance state goes far beyond punishing dissent. It prevents it from forming in the first place. People moderate their behaviour, speech, and associations long before the state needs to intervene. From the perspective of a potentially tyrannical government, this is the most effective technique for maximising obedience among its subjects.

Functionally, a future surveillance dystopia emerges incrementally through pilot projects such as these, framed as well-intentioned attempts to target criminals. Much like a parasite, by the time it is recognised as a problem in the body politic, it is already deeply embedded and difficult to remove. Unless current trends are disrupted, we may well find ourselves living inside our very own panopticon, forever watched by unseen authorities, despite doing nothing wrong.
