



Rules written for ease of measurement harden into procedures that exclude discretionary judgement. Repeated reliance on automated recommendations builds habits of deference, eroding the collective capacity to question or intervene.
These mechanisms produce predictable civic effects across British public life. Context flattens under standardised inputs. Interventions respond to proxy signals rather than to root causes. Trust erodes where communities experience mechanical, unexplained decisions. Biases embedded in models amplify inequality at scale. Skills in judgement, empathy, and deliberation atrophy from disuse, while institutions celebrate throughput and compliance.
Concrete UK examples
Healthcare: An NHS triage algorithm prioritises referrals based on coded symptoms, delaying specialist review for patients with atypical presentations.
Welfare: A Universal Credit eligibility filter, driven by narrow income feeds, issues sanctions before a claimant’s temporary circumstances are recorded.
Policing: A predictive deployment model directs patrols to flagged areas, reinforcing stop-and-search patterns without addressing underlying social needs.
Employment: Automated CV-screening tools exclude candidates with non-linear careers, disadvantaging carers and returners to work.