Britain joins a global rise in AI policing as fears grow over a future that watches… and predicts
A Fiction That Became Blueprint
Once, it was just television.
In Person of Interest, the reclusive genius Harold Finch built a system that could predict acts of violence before they happened. A digital oracle watching the world through cameras and code.
It felt like science fiction. Now it reads more like a leaked memo from the near future.

Britain’s Watching Eyes
Across the UK, police forces have quietly begun deploying live facial recognition technology in public spaces.
Cameras scan crowds. Faces are matched against watchlists in seconds. Decisions unfold in real time.
Supporters call it efficiency. Critics call it something colder: mass surveillance by algorithm.
Hundreds of arrests have already been linked to the technology. But errors have surfaced too. Wrong faces. Wrong people. Real consequences.

Because when a machine makes a mistake, it doesn’t apologise. It simply moves on.
The Rise of Predictive Policing
Beyond the cameras lies something even more unsettling.
AI systems are now being used across the UK, the United States, and Europe to predict where crime may happen next. Some go further still, analysing individuals and assigning “risk scores.”
Not crimes committed. Crimes expected.
The promise is seductive: fewer victims, smarter policing, safer streets.
But the danger lurks in the data. These systems learn from the past, and the past, as history repeatedly proves, is rarely fair.
Bias in. Bias out.
Only now, it wears the mask of objectivity.
China’s Fully Wired State
If Britain is testing the waters, China has already dived in.
Vast networks of AI-powered cameras track citizens across cities. Movements, behaviours, even associations are monitored and analysed in real time.
It is surveillance at scale. Industrial, seamless, and deeply embedded.
There, the line between security and control has already begun to blur.
A World Quietly Converging
Different nations. Different laws. Different limits. But the trajectory is unmistakable.
From London to Beijing to Washington, artificial intelligence is becoming the silent partner in policing. Faster than any officer. Tireless, unblinking, increasingly trusted and increasingly powerful.
The Question We Can’t Ignore
Harold Finch feared his Machine not because it failed, but because it worked too well.
Today, there is no single Machine. No solitary inventor wrestling with its conscience. Instead, there are fragments everywhere, stitched into daily life, humming beneath the surface.
And the question is no longer fiction.
If a machine can predict what you might do…
How long before it decides what you deserve?

“THE MACHINE HAS MANY NAMES”
From Pegasus to predictive policing, the real systems shaping a surveillance age
The Invisible Arsenal
Forget a single all-seeing AI. The modern “Machine” is a patchwork empire of tools, databases, and algorithms, each doing one small part of the job Harold Finch once imagined.
Individually, they seem manageable.
Together, they begin to feel… orchestral.
Here are some of the most powerful instruments in that symphony.
Pegasus: The Spy in Your Pocket
Developed by Israel’s NSO Group, Pegasus is perhaps the most infamous surveillance tool in the world. The spyware infects smartphones without any interaction from the user — a so-called zero-click attack — and gains access to messages, camera, microphone, and location. It has been covertly used by governments to monitor journalists, activists, and suspects. It doesn’t predict crime. It erases privacy entirely.
If Finch’s Machine listened to the world, Pegasus whispers directly into your life.
HOLMES2: Britain’s Digital Casebook
Used by UK police forces and the National Crime Agency, HOLMES2 is less cinematic but deeply powerful. It centralises evidence, witness statements, and intelligence; links people, events, and timelines across vast investigations; and helps detect patterns that human investigators might miss.
It is not predictive in itself, but it lays the data foundation for prediction.
Think of it as the memory of the Machine.
Live Facial Recognition (UK Policing Systems)
Deployed by forces such as the Metropolitan Police, live facial recognition scans faces in real time in public spaces and matches them against watchlists of suspected criminals.
It triggers alerts for officers on the ground.
It is fast and efficient, but still controversial.
Because a face is no longer just a face. It’s a data point in motion.
PredPol / Geolitica: Predicting the Next Crime
PredPol, now rebranded as Geolitica, uses historical crime data to forecast high-risk locations. The system generates patrol “heat maps” and is claimed to reduce crime through targeted policing, though critics argue it risks reinforcing systemic bias.
The Machine doesn’t need to accuse you.
It just needs to send police to your street more often.
Palantir Gotham: The Data Integrator
Used by intelligence agencies and police forces across the West, Palantir Gotham combines data from multiple sources into a single platform and maps relationships between people, places, and events, enabling investigators to uncover hidden networks.
If Pegasus is the ear, Gotham is the brain stitching the story together.
EternalBlue: The Weapon That Escaped
Originally developed by the NSA as a cyberweapon, EternalBlue exploited a vulnerability in Microsoft Windows systems. It was leaked in 2017 and used in global cyberattacks such as WannaCry, one of the most devastating ransomware outbreaks in history.
EternalBlue demonstrated how state tools can spiral beyond control: the darker edge of ‘the Machine’.
Not watching… but breaking things open.
China’s “Skynet” and “Sharp Eyes”
Skynet and Sharp Eyes are nationwide camera networks with AI facial recognition, providing real-time tracking of individuals across cities and integrated with Chinese law enforcement databases. Their sheer scale is their greatest advantage: this is not just a tool, or even a system. It’s an ecosystem of observation.
The Quiet Convergence:
None of these systems alone is “The Machine.”
But combined, they show:
– Pegasus’s access
– Facial recognition’s vision
– Palantir’s analysis
– Predictive policing’s foresight.
The Final Line
No single person controls this.
No single switch turns it off.
And no single conscience guides it.
Just systems talking to systems,
watching, learning, predicting…
while the rest of us carry on,
unaware of how much of ourselves is already readable.
