
The Privacy Paradox: Why Everett Shut Down Its AI Camera Network Over a Public Records Ruling

TL;DR: The city of Everett shut down its Flock Safety automated license plate reader (ALPR) network after a judge ruled that the footage is a public record. Faced with the reality that anyone could legally request the mass surveillance data, the city chose to abandon the system rather than compromise residents' privacy. The episode highlights a critical clash between government transparency laws and modern mass-data-collection technology.


Automated license plate readers (ALPRs) like those made by Flock Safety have become popular tools for local law enforcement to track vehicles and solve crimes. However, these systems operate in a legal gray area regarding who owns the massive amounts of data they collect and who can access it. Recently, that tension reached a boiling point in Everett, where a judge ruled that Flock camera footage falls under public records laws. The ruling instantly transformed a proprietary police tool into an open database of citizen movements, forcing the city to make a drastic decision.

Key Points

The core issue stems from the clash between sunshine laws, which are designed to keep government transparent, and mass surveillance technology. Once the judge ruled that Flock camera footage is a public record, any citizen, journalist, or bad actor could potentially request the tracking data for a specific vehicle. Realizing that fulfilling such requests would effectively turn the system into a surveillance tool against the general public, Everett opted to shut down the entire network. The city recognized that the administrative burden of redacting innocent citizens' data, combined with severe privacy liabilities, made the system impossible to maintain. This exposes a fundamental flaw in deploying black-box surveillance tech without clear legislative boundaries.

Technical Insights

From a software engineering perspective, this scenario is a classic failure of 'Privacy by Design' and data governance. Flock systems are optimized for rapid data ingestion, searchability by police, and short-term retention, but they lack automated, cost-effective tooling for the mass redaction that public records (FOIA-style) requests demand. When we build systems that aggregate personally identifiable information, the architecture usually assumes strict role-based access control bounded by the organization. If an external legal ruling suddenly alters the threat model—shifting the data from internal-only to publicly accessible—the system's technical tradeoffs, like storing raw images instead of instantly discarding or blurring non-target data at the edge, become fatal flaws.
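The alternative design the paragraph alludes to—filtering at the edge so non-target data is never retained—can be sketched in a few lines. This is a minimal illustration, not Flock's actual pipeline; the hotlist, field names, and `redact_at_edge` function are all hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical hotlist of plates tied to active investigations.
HOTLIST = {"ABC1234", "XYZ9876"}

@dataclass
class PlateRead:
    plate: str          # OCR result from the camera
    camera_id: str
    timestamp: datetime

def redact_at_edge(read: PlateRead) -> Optional[PlateRead]:
    """Retain a read only if it matches an active hotlist entry.

    Non-matching reads are dropped before they ever leave the
    camera, so a later public records request has nothing to
    expose about uninvolved drivers.
    """
    if read.plate in HOTLIST:
        return read
    return None  # never stored, so never subject to disclosure

reads = [
    PlateRead("ABC1234", "cam-17", datetime.now(timezone.utc)),
    PlateRead("DEF5555", "cam-17", datetime.now(timezone.utc)),
]
retained = [r for r in reads if redact_at_edge(r) is not None]
# Only the hotlist match survives ingestion.
```

The tradeoff is real: edge filtering forfeits retrospective searches ("who was near the scene last Tuesday?"), which is precisely why vendors default to retaining everything—and why that default becomes a liability once the data is publicly requestable.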

Implications

For developers and tech companies building GovTech or enterprise data systems, this is a wake-up call to treat legal compliance and data exposure as core architectural constraints. We can no longer just build the ‘happy path’ for the intended user; we must engineer safeguards for worst-case data exposure scenarios. This ruling may force ALPR vendors to fundamentally re-architect their platforms to include edge-computing redaction, zero-knowledge proofs, or cryptographic access logs to survive in states with strong public records laws.
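Of the safeguards mentioned above, a cryptographic access log is the simplest to illustrate: each query against the data is chained to the hash of the previous entry, so any retroactive tampering is detectable. A minimal sketch, assuming SHA-256 hash chaining (the entry fields and function names are illustrative, not any vendor's API):

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first entry

def append_entry(log: list, who: str, query: str) -> None:
    """Append a tamper-evident entry chained to the previous hash."""
    prev = log[-1]["hash"] if log else GENESIS
    body = {"who": who, "query": query, "prev": prev}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()
    ).hexdigest()
    log.append({**body, "hash": digest})

def verify(log: list) -> bool:
    """Recompute the chain; editing any entry breaks every later hash."""
    prev = GENESIS
    for entry in log:
        body = {"who": entry["who"], "query": entry["query"], "prev": prev}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if entry["prev"] != prev or entry["hash"] != expected:
            return False
        prev = entry["hash"]
    return True

access_log: list = []
append_entry(access_log, "officer-42", "plate=ABC1234")
append_entry(access_log, "auditor-1", "plate=XYZ9876")
ok_before = verify(access_log)          # chain is intact
access_log[0]["query"] = "plate=EDITED" # retroactive tampering...
ok_after = verify(access_log)           # ...breaks verification
```

Such a log does not prevent improper queries, but it makes every access auditable after the fact—exactly the kind of accountability a public records regime could then inspect without exposing the underlying plate data itself.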


As AI and mass data collection become cheaper, the line between public transparency and individual privacy will only get blurrier. How should we architect systems that balance the public’s right to know with the right to be left alone? Moving forward, engineers must ask themselves not just ‘can we collect this data?’, but ‘what happens if we are legally forced to share it?’

