The neighbourhood had seen a rise in stop-and-search incidents. Young people, mostly Black and Asian, reported being questioned repeatedly, often without clear cause. One officer, newly assigned to the area, explained that the patrol routes were generated by a predictive system. “The map said go there,” he said. “It highlights hotspots based on past data.”
A community organiser asked how the system accounted for bias in historical records. The officer shrugged. “I don’t know. We just follow the zones.”
One evening, a teenager was stopped for the third time that month. He asked why. The officer pointed to the tablet. “It flagged this street,” he said. “It’s not personal.”
The teenager said, “It feels personal.”
Later, at a public meeting, residents asked for the system’s criteria. No one could explain. The data was proprietary. The thresholds were confidential. The delegation was complete.
The author records this moment as a civic distortion: a system designed to optimise safety had reinforced surveillance. A decision made by algorithm had displaced discretion. The officer was present, but not accountable; the citizen was visible, but not recognised.
Vignette: The Bot Didn’t Flag It - NHS
She had been trying to book an appointment for three weeks. The online triage system asked her to describe her symptoms. She typed: “Persistent chest pain, shortness of breath, fatigue.” The bot returned a message: “Your symptoms do not meet the criteria for urgent care. Please try again later or contact 111.”
She tried again the next day, adding more detail. The same response. She decided to call the surgery. The receptionist said, “We can’t override the system, as it didn’t flag you.”
She asked if a doctor could review her notes. The receptionist paused. “There’s no manual review anymore. Everything goes through the bot.”
She waited another week, and the pain worsened. Eventually, she went to A&E, where the doctor told her, “You should have been seen days ago.”
The author records this moment as a failure of presence. A system designed to manage demand had displaced judgment. A patient had been unseen, not because no one cared, but because no one was allowed to intervene. The bot didn’t flag it. So no one did.