In the seminal Fourth Amendment/privacy case of Katz v. United States, the U.S. Supreme Court held that a person had a privacy interest in the contents of a conversation made from a payphone (remember payphones?) and that the government needed a court order to intercept the contents of the communication. The Court explained that “What a person knowingly exposes to the public, even in his own home or office, is not a subject of Fourth Amendment protection. But what he seeks to preserve as private, even in an area accessible to the public, may be constitutionally protected.”
For years, courts have struggled with questions about whether things are “knowingly exposed to the public.” Marijuana being grown in a backyard with a 40-foot fence? Public – as long as it can be seen from a helicopter with a telescope at a lawful altitude. Trash thrown out at the curb? Public and abandoned property. Same trash on the curtilage of the house? Private. The smell of drugs coming from a car or suitcase, able to be sniffed by a dog? Public. The same smell on the outside of a home? Private. The heat emanating from grow lights and measurable by an IR sensor? Private. Heat from the engine of a car? Public. It’s pretty much hit or miss.
But sometimes, “what someone knowingly exposes to the public” can be considered private because of the way the data is collected, or because of the sheer volume and intrusiveness of the “public” information. There’s a difference between running into someone on the street and recognizing them (public), collecting information on everyone walking down the street, and collecting information on everyone walking down the street and then running it through facial recognition software and pattern analysis – even though all of that data is collected “in public.” To some extent, our expectations of privacy – and the law’s willingness to accept them as reasonable – are dictated by the technology we know about and expect to be used. Similarly, there’s a difference between the government being able to follow you on the street, being able to know where you are at all times, and being able to know where everybody is at all times.
Enter ALPRs. Automated License Plate Readers have been a boon to law enforcement agencies and a bane to privacy advocates. These cameras capture images of vehicle license plates, read them, and collect the data. They can be deployed in many different ways. A roving police car can capture ALPR data as it encounters cars driving or parked on the streets. Computers can simply check whether a car has been reported stolen, and the police can then use that information either to arrest the operator or recover the vehicle. Cars can be automatically checked for valid or current registration, or the owner of the vehicle (who the Supreme Court has held can be assumed to be the operator, even if the operator is a different race, gender or age than the owner) can be checked to see whether their driver’s license is current and valid, or whether there are any warrants out for their arrest. Indeed, when the police stop a car at the side of the road and demand a license, registration and proof of insurance, the odds are they already know these details before they approach the car.
ALPRs are also used by retail establishments, shopping malls, traffic analysts and others to determine traffic patterns, information about shoppers, and the movement of specific individuals. Data from “private” ALPRs can be voluntarily provided to the police, or compelled by subpoena, warrant, or administrative demand. Repo men use ALPRs to find and repossess delinquent cars, and bounty hunters and divorce attorneys can buy ALPR data to find those who have skipped bail or skipped out on child support. Flock, a company that sells some of this technology, advertises how it can be used at short range and long range, with existing cameras, and for private security, and how the data can be collected, analyzed and shared. “What a person knowingly exposes to the public is not a subject of Fourth Amendment protection.”
FLOCK You
On May 10 of this year, a Circuit Court judge in Norfolk, Virginia considered the privacy implications of the ubiquitous use of ALPRs by police. In Commonwealth v. Bell, the court addressed the fact that Norfolk police in 2023 installed a network of 172 ALPRs throughout the city using Flock’s tools. The cameras were motion activated; they captured images of every vehicle that passed by, read and stored the image of the vehicle and its occupants, and kept a record of the license plate, owner, and related data for thirty days. The data and the results were accessible to any law enforcement entity in Virginia. The cameras were installed together with “ShotSpotter” audio detectors designed to listen for the sound of gunshots and alert the police when such sounds are detected. Naturally, these cameras were installed in areas where the Norfolk police expected crime. Don’t expect the residents of Oceanbrook or Meadowview to be captured on camera – think more Young Terrace or Huntersville.
Essentially, the government set up a surveillance system to capture millions of data points about people who were doing nothing wrong. There was no probable cause, no reasonable suspicion. Any car that passed a camera would trigger the Flock system and create a record. It’s one thing to run a quick license check on a car you have pulled over, but another thing altogether to create a database of every place every person has been – even if both use the same, publicly exposed, license plate data.
Ringing the Bell
After a robbery at a video game store in Chesapeake, Virginia, a witness observed a gray Dodge minivan with a specific license plate leaving the store. Police ran that license plate through the Flock system, which identified a car belonging to Jayvon Bell’s wife. When they later found and stopped the car, it was being driven by Bell, who was arrested for the robbery of that video game store and of another in Norfolk. Bell challenged the warrantless collection of the ALPR data, as well as the warrantless query of the database created by the system.
The Circuit Court suppressed the evidence. At the outset, the court likened the ALPR dragnet to the aerial surveillance network challenged in Leaders of a Beautiful Struggle v. Baltimore Police Department. In that case, the City of Baltimore created a program called “AIR,” which used aerial photography supplied by Persistent Surveillance Systems (PSS) to track the movements of people in public spaces. As the court described it, “Multiple planes fly distinct orbits above Baltimore, equipped with PSS’s camera technology known as the ‘Hawkeye Wide Area Imaging System.’ The cameras capture roughly 32 square miles per image per second.”
The planes transmit their photographs to PSS “ground stations” where contractors use the data to “track individuals and vehicles from a crime scene and extract information.”
Similarly, in United States v. Jones, 565 U.S. 400 (2012), the Supreme Court found the warrantless installation and monitoring of GPS tracking devices on cars to be unconstitutional – but it based that holding on the property rights invaded when the tracking device was installed, not so much on the privacy rights invaded when the vehicles were tracked. In Carpenter v. United States, 585 U.S. 296 (2018), the Supreme Court extended this to require the government to obtain a search warrant for cell tower location data, recognizing that the location data of phones (and therefore of people) is sensitive information. Why should the location data of cars (and therefore of people) be any less sensitive?
The Norfolk judge also expressed concern that the Flock system had been deployed without any real debate, and that no evidence had been presented in court about how it worked. No custodian of records described the database. Just a cop saying he “pinged” the black box. As the judge explained, “National, state, and local governments can use that information for a variety of administrative purposes and to help apprehend dangerous criminals. But knowledge is power, and power can be abused.” Finally, the court observed that “the citizens of Norfolk may be concerned to learn the extent to which the Norfolk Police Department is tracking and maintaining a database of their every movement for 30 days. The Defendant argues ‘what we have is a dragnet over the entire city’ retained for a month and the Court agrees.”
The court distinguished a Massachusetts case, Commonwealth v. McCarthy, which accepted ALPRs on a single bridge, from Norfolk’s citywide dragnet of cameras.
While not mentioned expressly, the case also has implications for the next generation of AI technologies, which depend upon massive databases – and privacy-impacting data collection practices – to be “effective.” Individual bits of data may not implicate privacy, but comprehensive data collection (or analysis) may. The fact that I went to a website might not be private (depending on the website), but my entire search history might very well be. The challenge to privacy relates to volume, sensitivity and duration.
At the end of the day, the court skipped the whole “public space” and “private space” debate and simply decided that the practice of surveilling an entire community was, well, creepy. I’m sure that’s in the Constitution somewhere.