Let’s start with Flock, the company behind a number of automated license plate reader (ALPR) and other camera technologies. You might be surprised at how many Flock cameras there are in your community. Many large and small municipalities around the country have signed deals with Flock for license plate readers to track the movement of all cars in their city. Even though these deals are signed by local police departments, oftentimes ICE also gains access.
Because of their ubiquity, people are interested in finding out where Flock cameras are in their community and how many there are. One project that can help with this is the OUI-SPY, a small piece of open source hardware. The OUI-SPY runs on a cheap Arduino-compatible chip called an ESP32. Several programs can be loaded onto the chip, such as “Flock You,” which detects Flock cameras, and “Sky-Spy,” which detects overhead drones. There’s also “BLE Detect,” which detects various Bluetooth signals, including ones from Axon devices, Meta’s Ray-Bans that secretly record you, and more. It also has a mode commonly known as “fox hunting” for tracking down a specific device. Activists and researchers can use this tool to map out different technologies and quantify the spread of surveillance.
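The core trick behind this kind of detector is simple: the first three octets of a Wi-Fi or Bluetooth MAC address form an organizationally unique identifier (OUI) assigned to the hardware vendor, so a scanner can flag nearby devices by comparing broadcast addresses against a vendor watchlist. Here is a minimal sketch of the matching step in Python; the OUIs and vendor names are made-up placeholders, not real assignments or the actual OUI-SPY code:

```python
# Toy illustration of OUI-based device detection: match the vendor prefix
# (first three octets) of an observed MAC address against a watchlist.
# The prefixes and labels below are hypothetical placeholders.

WATCHLIST = {
    "AA:BB:CC": "Example Vendor A",  # hypothetical OUI
    "11:22:33": "Example Vendor B",  # hypothetical OUI
}

def check_mac(mac: str):
    """Return the watchlist label if the MAC's OUI matches, else None."""
    # Normalize separators and case, then take the first three octets.
    oui = mac.upper().replace("-", ":")[:8]
    return WATCHLIST.get(oui)

print(check_mac("aa:bb:cc:12:34:56"))  # → Example Vendor A
print(check_mac("99:88:77:00:00:00"))  # → None
```

A real detector would pair this lookup with a live Bluetooth or Wi-Fi scan and, for “fox hunting,” track the signal strength of one matched address to home in on it.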
There’s also the open source WiGLE app, which is primarily designed for mapping Wi-Fi networks but can also sound an audio alert when a specific Wi-Fi or Bluetooth identifier is detected. This means you can set it up to notify you when it detects products from Flock, Axon, or other nasties in your vicinity.
One enterprising YouTuber, Benn Jordan, figured out a way to fool Flock cameras into not recording his license plate simply by painting some minor visual noise on his license plate. This is innocuous enough that any human will still be able to read his license plate, but it completely prevented Flock devices from recognizing his license plate as a license plate at the time. Some states have outlawed drivers obscuring their license plates, so taking such action is not recommended.
Jordan later went on to discover hundreds of misconfigured Flock cameras that were exposing their administrator interface without a password on the public internet. This would allow anyone with an internet connection to view a live surveillance feed, download 30 days of video, view logs, and more. The cameras pointed at parks, public trails, busy intersections, and even a playground. This was a massive breach of public trust and a huge mistake for a company that claims to be working for public safety.
Other hackers have taken on the task of open-source intelligence and community reporting. Two interesting examples are deflock.me and alpr.watch, crowdsourced maps of ALPR cameras. Much like the OUI-SPY project, these allow activists to map out and expose Flock surveillance cameras in their communities.
Another interesting project documenting ICE and creating a trove of open-source intelligence is ICE List Wiki which contains info on companies that have contracts with ICE, incidents and encounters with ICE, and vehicles ICE uses.
People without programming knowledge can also get involved. In Chicago, people used whistles to warn their neighbors that ICE was present or in the area. Many people 3D-printed whistles along with instructional booklets to hand out to their communities, allowing a wider distribution of whistles and consequently earlier warnings for their neighbors.
There is also EFF’s own Rayhunter project for detecting cell-site simulators, about which we have written extensively. Rayhunter runs on a cheap mobile hotspot and doesn’t require deep technical knowledge to use.
It’s important to remember that we are not powerless. Even in the face of a domestic law enforcement presence with massive surveillance capabilities and military-esque technologies, there are still ways to engage in surveillance self-defense. We cannot give into nihilism and fear. We must continue to find small ways to protect ourselves and our communities, and when we can, fight back.
EFF is not affiliated with any of these projects (other than Rayhunter) and does not endorse them. We don’t make any statements about the legality of using any of these projects. Please consult with an attorney to determine what risks there may be.
It’s no secret that 2025 has given Americans plenty to protest about. But as news cameras showed protesters filling the streets of cities across the country, law enforcement officers—including U.S. Border Patrol agents—were quietly watching those same streets through different lenses: Flock Safety automated license plate readers (ALPRs) that tracked every passing car.
Through an analysis of 10 months of nationwide searches on Flock Safety’s servers, we discovered that more than 50 federal, state, and local agencies ran hundreds of searches through Flock’s national network of surveillance data in connection with protest activity. In some cases, law enforcement specifically targeted known activist groups, demonstrating how mass surveillance technology increasingly threatens our freedom to demonstrate.
Flock Safety provides ALPR technology to thousands of law enforcement agencies. The company installs cameras throughout their jurisdictions, and these cameras photograph every car that passes, documenting the license plate, color, make, model and other distinguishing characteristics. This data is paired with time and location, and uploaded to a massive searchable database. Flock Safety encourages agencies to share the data they collect broadly with other agencies across the country. It is common for an agency to search thousands of networks nationwide even when they don’t have reason to believe a targeted vehicle left the region.
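The system described above can be pictured as a toy data model: each camera sighting becomes a searchable record, and a query can filter on vehicle attributes rather than a known plate. The field names and query interface below are assumptions for illustration only, not Flock Safety’s actual schema or API:

```python
# Hypothetical sketch of an ALPR sighting record and an attribute query.
# Real systems index millions of such records across shared networks.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class Detection:
    plate: str
    color: str
    make: str
    model: str
    seen_at: datetime
    camera_id: str  # ties the sighting to a fixed location

def search(detections, plate=None, color=None):
    """Filter sightings the way a 'vehicle fingerprint' query might."""
    return [
        d for d in detections
        if (plate is None or d.plate == plate)
        and (color is None or d.color == color)
    ]

db = [
    Detection("ABC1234", "red", "Ford", "F-150",
              datetime(2025, 6, 14, 9, 30), "cam-001"),
    Detection("XYZ9876", "blue", "Honda", "Civic",
              datetime(2025, 6, 14, 9, 31), "cam-001"),
]
print(len(search(db, color="red")))  # → 1
```

Note that a color-only query like the one above returns every matching vehicle that passed a camera, which is exactly why attribute searches sweep in bystanders.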
Via public records requests, EFF obtained datasets representing more than 12 million searches logged by more than 3,900 agencies between December 2024 and October 2025. The data shows that agencies logged hundreds of searches related to the 50501 protests in February, the Hands Off protests in April, the No Kings protests in June and October, and other protests in between.
The Tulsa Police Department in Oklahoma was one of the most consistent users of Flock Safety’s ALPR system for investigating protests, logging at least 38 such searches. This included running searches that corresponded to a protest against deportation raids in February, a protest at Tulsa City Hall in support of pro-Palestinian activist Mahmoud Khalil in March, and the No Kings protest in June. During the most recent No Kings protests in mid-October, agencies such as the Lisle Police Department in Illinois, the Oro Valley Police Department in Arizona, and the Putnam County (Tenn.) Sheriff’s Office all ran protest-related searches.
While EFF and other civil liberties groups argue the law should require a search warrant for such searches, police are simply prompted to enter text into a “reason” field in the Flock Safety system. Usually this is only a few words–or even just one.
In these cases, that word was often just “protest.”
Crime does sometimes occur at protests, whether that’s property damage, pick-pocketing, or clashes between groups on opposite sides of a protest. Some of these searches may have been tied to an actual crime that occurred, even though in most cases officers did not articulate a criminal offense when running the search. But the truth is, the only reason an officer is able to even search for a suspect at a protest is because ALPRs collected data on every single person who attended the protest.
Search and Dissent
2025 was an unprecedented year of street action. In June and again in October, thousands across the country mobilized under the banner of the “No Kings” movement—marches against government overreach, surveillance, and corporate power. By some estimates, the October demonstrations ranked among the largest single-day protests in U.S. history, filling the streets from Washington, D.C., to Portland, OR.
EFF identified 19 agencies that logged dozens of searches associated with the No Kings protests in June and October 2025. In some cases the term “No Kings” was explicitly used as the search reason, while in others the term “protest” was used but the search coincided with the massive protests.
Law Enforcement Agencies that Ran Searches Corresponding with “No Kings” Rallies

* Anaheim Police Department, Calif.
* Arizona Department of Public Safety
* Beaumont Police Department, Texas
* Charleston Police Department, SC
* Flagler County Sheriff’s Office, Fla.
* Georgia State Patrol
* Lisle Police Department, Ill.
* Little Rock Police Department, Ark.
* Marion Police Department, Ohio
* Morristown Police Department, Tenn.
* Oro Valley Police Department, Ariz.
* Putnam County Sheriff’s Office, Tenn.
* Richmond Police Department, Va.
* Riverside County Sheriff’s Office, Calif.
* Salinas Police Department, Calif.
* San Bernardino County Sheriff’s Office, Calif.
* Spartanburg Police Department, SC
* Tempe Police Department, Ariz.
* Tulsa Police Department, Okla.
* US Border Patrol
For example:
In Washington state, the Spokane County Sheriff’s Office listed “no kings” as the reason for three searches on June 15, 2025. The agency queried 95 camera networks, looking for vehicles matching the description of “work van,” “bus” or “box truck.”
In Texas, the Beaumont Police Department ran six searches related to two vehicles on June 14, 2025, listing “KINGS DAY PROTEST” as the reason. The queries reached across 1,774 networks.
In California, the San Bernardino County Sheriff’s Office ran a single search for a vehicle across 711 networks, logging “no king” as the reason.
In Arizona, the Tempe Police Department made three searches for “ATL No Kings Protest” on June 15, 2025, searching through 425 networks. “ATL” is police code for “attempt to locate.” The agency appears not to have been looking for a particular plate, but for any red vehicle on the road during a certain time window.
But the No Kings protests weren’t the only demonstrations drawing law enforcement’s digital dragnet in 2025.
For example:
In Nevada’s state capital, the Carson City Sheriff’s Office ran three searches that correspond to the February 50501 Protests against DOGE and the Trump administration. The agency searched for two vehicles across 178 networks with “protest” as the reason.
In Florida, the Seminole County Sheriff’s Office logged “protest” for five searches that correspond to a local May Day rally.
In Alabama, the Homewood Police Department logged four searches in early July 2025 for three vehicles with “PROTEST CASE” and “PROTEST INV.” in the reason field. The searches, which probed 1,308 networks, correspond to protests against the police shooting of Jabari Peoples.
In Texas, the Lubbock Police Department ran two searches for a Tennessee license plate on March 15 that correspond to a rally highlighting the mental health impact of immigration policies. The searches hit 5,966 networks, with the logged reason “protest veh.”
In Michigan, Grand Rapids Police Department ran five searches that corresponded with the Stand Up and Fight Back Rally in February. The searches hit roughly 650 networks, with the reason logged as “Protest.”
Some agencies have adopted policies that prohibit using ALPRs for monitoring activities protected by the First Amendment. Yet many officers probed the nationwide network with terms like “protest” without articulating an actual crime under investigation.
In a few cases, police were using Flock’s ALPR network to investigate threats made against attendees or incidents where motorists opposed to the protests drove their vehicle into crowds. For example, throughout June 2025, an Arizona Department of Public Safety officer logged three searches for “no kings rock threat,” and a Wichita (Kan.) Police Department officer logged 22 searches for various license plates under the reason “Crime Stoppers Tip of causing harm during protests.”
Even when law enforcement is specifically looking for vehicles engaged in potentially criminal behavior such as threatening protesters, it cannot be ignored that mass surveillance systems work by collecting data on everyone driving to or near a protest—not just those under suspicion.
Border Patrol’s Expanding Reach
As U.S. Border Patrol (USBP), ICE, and other federal agencies tasked with immigration enforcement have massively expanded operations into major cities, advocates for immigrants have responded through organized rallies, rapid-response confrontations, and extended presences at federal facilities.
USBP has made extensive use of Flock Safety’s system for immigration enforcement, but also to target those who object to its tactics. In June, a few days after the No Kings Protest, USBP ran three searches for a vehicle using the descriptor “Portland Riots.”
USBP also used the Flock Safety network to investigate a motorist who had “extended his middle finger” at Border Patrol vehicles that were transporting detainees. The motorist then allegedly drove in front of one of the vehicles and slowed down, forcing the Border Patrol vehicle to brake hard. An officer ran seven searches for his plate, citing “assault on agent” and “18 usc 111,” the federal criminal statute for assaulting, resisting or impeding a federal officer. The individual was charged in federal court in early August.
USBP had access to the Flock system during a trial period in the first half of 2025, but the company says it has since paused the agency’s access to the system. However, Border Patrol and other federal immigration authorities have been able to access the system’s data through local agencies who have run searches on their behalf or even lent them logins.
Targeting Animal Rights Activists
Law enforcement’s use of Flock’s ALPR network to surveil protesters isn’t limited to large-scale political demonstrations. Three agencies also used the system dozens of times to specifically target activists from Direct Action Everywhere (DxE), an animal-rights organization known for using civil disobedience tactics to expose conditions at factory farms.
Delaware State Police queried the Flock national network nine times in March 2025 related to DxE actions, logging reasons such as “DxE Protest Suspect Vehicle.” DxE advocates told EFF that these searches correspond to an investigation the organization undertook of a Mountaire Farms facility.
Additionally, the California Highway Patrol logged dozens of searches related to a “DXE Operation” throughout the day on May 27, 2025. The organization says this corresponds with an annual convening in California that typically ends in a direct action. Participants leave the event early in the morning, then drive across the state to a predetermined but previously undisclosed protest site. Also in May, the Merced County Sheriff’s Office in California logged two searches related to “DXE activity.”
As an organization engaged in direct activism, DxE has experienced criminal prosecution for its activities, and so the organization told EFF they were not surprised to learn they are under scrutiny from law enforcement, particularly considering how industrial farmers have collected and distributed their own intelligence to police.
The targeting of DxE activists reveals how ALPR surveillance extends beyond conventional and large-scale political protests to target groups engaged in activism that challenges powerful industries. For animal-rights activists, the knowledge that their vehicles are being tracked through a national surveillance network undeniably creates a chilling effect on their ability to organize and demonstrate.
Fighting Back Against ALPR
ALPR systems are designed to capture information on every vehicle that passes within view. That means they don’t just capture data on “criminals” but on everyone, all the time—and that includes people engaged in their First Amendment right to publicly dissent. Police are sitting on massive troves of data that can reveal who attended a protest, and this data shows they are not afraid to use it.
Our analysis only includes data where agencies explicitly mentioned protests or related terms in the “reason” field when documenting their search. It’s likely that scores more were conducted under less obvious pretexts and search reasons. According to our analysis, approximately 20 percent of all searches we reviewed listed vague language like “investigation,” “suspect,” and “query” in the reason field. Those terms could well be cover for spying on a protest, an abortion prosecution, or an officer stalking a spouse, and no one would be the wiser–including the agencies whose data was searched. Flock has said it will now require officers to select a specific crime under investigation, but that can and will also be used to obfuscate dubious searches.
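The kind of audit-log analysis described above can be approximated in a few lines: flag any search whose “reason” field is a vague catch-all rather than an articulable offense. The sample log entries and keyword list below are illustrative stand-ins, not EFF’s actual dataset or methodology:

```python
# Toy version of a reason-field audit: count what share of ALPR searches
# were logged with vague catch-all justifications. Sample data is made up.

VAGUE_TERMS = {"investigation", "suspect", "query"}

def is_vague(reason: str) -> bool:
    """True if any word in the logged reason is a vague catch-all term."""
    return any(w.strip(".") in VAGUE_TERMS for w in reason.lower().split())

log = [
    "protest",
    "investigation",
    "stolen vehicle",
    "suspect veh",
    "query",
]

vague_share = sum(is_vague(r) for r in log) / len(log)
print(f"{vague_share:.0%} of sampled reasons were vague")  # → 60% here
```

A real audit would also normalize abbreviations and misspellings, which is part of why vague reason fields make meaningful oversight so difficult.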
For protesters, this data should serve as confirmation that ALPR surveillance has been and will be used to target activities protected by the First Amendment. Depending on your threat model, this means you should think carefully about how you arrive at protests, and explore options such as biking, walking, carpooling, taking public transportation, or simply parking a little farther away from the action. Our Surveillance Self-Defense project has more information on steps you could take to protect your privacy when traveling to and attending a protest.
For local officials, this should serve as another example of how systems marketed as protecting your community may actually threaten the values your communities hold most dear. The best way to protect people is to shut down these camera networks.
Everyone should have the right to speak up against injustice without ending up in a database.
More than 80 law enforcement agencies across the United States have used language perpetuating harmful stereotypes against Romani people when searching the nationwide Flock Safety automated license plate reader (ALPR) network, according to audit logs obtained and analyzed by the Electronic Frontier Foundation.
When police run a search through the Flock Safety network, which links thousands of ALPR systems, they are prompted to leave a reason and/or case number for the search. Between June 2024 and October 2025, cops performed hundreds of searches for license plates using terms such as “roma” and “g*psy,” and in many instances, without any mention of a suspected crime. Other uses include “g*psy vehicle,” “g*psy group,” “possible g*psy,” “roma traveler” and “g*psy ruse,” perpetuating systemic harm by demeaning individuals based on their race or ethnicity.
These queries were run through thousands of police departments’ systems—and it appears that none of these agencies flagged the searches as inappropriate.
These searches are, by definition, racist.
Word Choices and Flock Searches
We are using the terms “Roma” and “Romani people” as umbrella terms, recognizing that they represent different but related groups. Since 2020, the U.S. federal government has officially recognized “Anti-Roma Racism” as including behaviors such as “stereotyping Roma as persons who engage in criminal behavior” and using the slur “g*psy.” According to the U.S. Department of State, this language “leads to the treatment of Roma as an alleged alien group and associates them with a series of pejorative stereotypes and distorted images that represent a specific form of racism.”
Nevertheless, police officers have run hundreds of searches for license plates using the terms “roma” and “g*psy.” (Unlike the police ALPR queries we’ve uncovered, we substitute an asterisk for the Y to avoid repeating this racist slur). In many cases, these terms have been used on their own, with no mention of crime. In other cases, the terms have been used in contexts like “g*psy scam” and “roma burglary,” when ethnicity should have no relevance to how a crime is investigated or prosecuted.
A “g*psy scam” and “roma burglary” do not exist in criminal law separate from any other type of fraud or burglary. Several agencies contacted by EFF have since acknowledged the inappropriate use and expressed efforts to address the issue internally.
“The use of the term does not reflect the values or expected practices of our department,” a representative of the Palos Heights (IL) Police Department wrote to EFF after being confronted with two dozen searches involving the term “g*psy.” “We do not condone the use of outdated or offensive terminology, and we will take this inquiry as an opportunity to educate those who are unaware of the negative connotation and to ensure that investigative notations and search reasons are documented in a manner that is accurate, professional, and free of potentially harmful language.”
Of course, the broader issue is that allowing “g*psy” or “Roma” as a reason for a search isn’t just offensive; it implies the criminalization of an ethnic group. In fact, the Grand Prairie Police Department in Texas searched for “g*psy” six times while using Flock’s “Convoy” feature, which allows an agency to identify vehicles traveling together—in essence targeting an entire traveling community of Roma without specifying a crime.
At the bottom of this post is a list of agencies and the terms they used when searching the Flock system.
Anti-Roma Racism in an Age of Surveillance
Racism against Romani people has been a problem for centuries, with one of its most horrific manifestations during the Holocaust, when the Third Reich and its allies perpetuated genocide by murdering hundreds of thousands of Romani people and sterilizing thousands more. Despite efforts by the UN and EU to combat anti-Roma discrimination, this form of racism persists. As scholars Margareta Matache and Mary T. Bassett explain, it is perpetuated by modern American policing practices:
In recent years, police departments have set up task forces specialised in “G*psy crimes”, appointed “G*psy crime” detectives, and organised police training courses on “G*psy criminality”. The National Association of Bunco Investigators (NABI), an organisation of law enforcement professionals focusing on “non-traditional organised crime”, has even created a database of individuals arrested or suspected of criminal activity, which clearly marked those who were Roma.
Thus, it is no surprise that a 2020 Harvard University survey of Romani Americans found that 4 out of 10 respondents reported being subjected to racial profiling by police. This demonstrates the ongoing challenges they face due to systemic racism and biased policing.
Notably, many police agencies using surveillance technologies like ALPRs have adopted some sort of basic policy against biased policing or the use of these systems to target people based on race or ethnicity. But even when such policies are in place, an agency’s failure to enforce them allows these discriminatory practices to persist. These searches were also run through the systems of thousands of other police departments that may have their own policies and state laws that prohibit bias-based policing—yet none of those agencies appeared to have flagged the searches as inappropriate.
The Flock search data in question here shows that surveillance technology exacerbates racism, and even well-meaning policies to address bias can quickly fall apart without proper oversight and accountability.
Cops In Their Own Words
EFF reached out to a sample of the police departments that ran these searches. Here are five representative responses we received from police departments in Illinois, California, and Virginia. They do not inspire confidence.
1. Lake County Sheriff’s Office, IL
In June 2025, the Lake County Sheriff’s Office ran three searches for a dark colored pick-up truck, using the reason: “G*PSY Scam.” The search covered 1,233 networks, representing 14,467 different ALPR devices.
In response to EFF, a sheriff’s representative wrote via email:
“Thank you for reaching out and for bringing this to our attention. We certainly understand your concern regarding the use of that terminology, which we do not condone or support, and we want to assure you that we are looking into the matter.
Any sort of discriminatory practice is strictly prohibited at our organization. If you have the time to take a look at our commitment to the community and our strong relationship with the community, I firmly believe you will see discrimination is not tolerated and is quite frankly repudiated by those serving in our organization.
We appreciate you bringing this to our attention so we can look further into this and address it.”
2. Sacramento Police Department, CA
In May 2025, the Sacramento Police Department ran six searches using the term “g*psy.” The search covered 468 networks, representing 12,885 different ALPR devices.
In response to EFF, a police representative wrote:
“Thank you again for reaching out. We looked into the searches you mentioned and were able to confirm the entries. We’ve since reminded the team to be mindful about how they document investigative reasons. The entry reflected an investigative lead, not a disparaging reference.
We appreciate the chance to clarify.”
3. Palos Heights Police Department, IL
In September 2024, the Palos Heights Police Department ran more than two dozen searches using terms such as “g*psy vehicle,” “g*psy scam” and “g*psy concrete vehicle.” Most searches hit roughly 1,000 networks.
In response to EFF, a police representative said the searches were related to a singular criminal investigation into a vehicle involved in a “suspicious circumstance/fraudulent contracting incident” and is “not indicative of a general search based on racial or ethnic profiling.” However, the agency acknowledged the language was inappropriate:
“The use of the term does not reflect the values or expected practices of our department. We do not condone the use of outdated or offensive terminology, and we will take this inquiry as an opportunity to educate those who are unaware of the negative connotation and to ensure that investigative notations and search reasons are documented in a manner that is accurate, professional, and free of potentially harmful language.
We appreciate your outreach on this matter and the opportunity to provide clarification.”
4. Irvine Police Department, CA
In February and May 2025, the Irvine Police Department ran eight searches using the term “roma” in the reason field. The searches covered 1,420 networks, representing 29,364 different ALPR devices.
In a call with EFF, an IPD representative explained that the cases were related to a series of organized thefts. However, they acknowledged the issue, saying, “I think it’s an opportunity for our agency to look at those entries and to use a case number or use a different term.”
5. Fairfax County Police Department, VA
Between December 2024 and April 2025, the Fairfax County Police Department ran more than 150 searches involving terms such as “g*psy case” and “roma crew burglaries.” Fairfax County PD continued to defend its use of this language.
In response to EFF, a police representative wrote:
“Thank you for your inquiry. When conducting searches in investigative databases, our detectives must use the exact case identifiers, terms, or names connected to a criminal investigation in order to properly retrieve information. These entries reflect terminology already tied to specific cases and investigative files from other agencies, not a bias or judgment about any group of people. The use of such identifiers does not reflect bias or discrimination and is not inconsistent with our Bias-Based Policing policy within our Human Relations General Order.”
A National Trend
Roma individuals and families are not the only ones being systematically and discriminatorily targeted by ALPR surveillance technologies. For example, Flock audit logs show agencies ran 400 more searches using terms targeting Traveller communities more generally, with a specific focus on Irish Travellers, often without any mention of a crime.
Across the country, these tools are enabling and amplifying racial profiling by embedding longstanding policing biases into surveillance technologies. For example, data from Oak Park, IL, show that 84% of drivers stopped in Flock-related traffic incidents were Black—despite Black people making up only 19% of the local population. ALPR systems are far from being neutral tools for public safety and are increasingly being used to fuel discriminatory policing practices against historically marginalized people.
The racially coded language in Flock’s logs mirrors long-standing patterns of discriminatory policing. Terms like “furtive movements,” “suspicious behavior,” and “high crime area” have always been cited by police to try to justify stops and searches of Black, Latine, and Native communities. These phrases might not appear in official logs because they’re embedded earlier in enforcement—in the traffic stop without clear cause, the undocumented stop-and-frisk, the intelligence bulletin flagging entire neighborhoods as suspect. They function invisibly until a body-worn camera, court filing, or audit brings them to light. Flock’s network didn’t create racial profiling; it industrialized it, turning deeply encoded and vague language into scalable surveillance that can search thousands of cameras across state lines.
The Path Forward
U.S. Sen. Ron Wyden, D-OR, recently recommended that local governments reevaluate their decisions to install Flock Safety in their communities. We agree, but we also understand that sometimes elected officials need to see the abuse with their own eyes first.
We know which agencies ran these racist searches, and they should be held accountable. But we also know that the vast majority of Flock Safety’s clients—thousands of police and sheriffs—also allowed those racist searches to run through their Flock Safety systems unchallenged.
Elected officials must act decisively to address the racist policing enabled by Flock’s infrastructure. First, they should demand a complete audit of all ALPR searches conducted in their jurisdiction and a review of search logs to determine (a) whether their police agencies participated in discriminatory policing and (b) what safeguards, if any, exist to prevent such abuse. Second, officials should institute immediate restrictions on data-sharing through Flock’s nationwide network. As demonstrated by California law, for example, police agencies should not be able to share their ALPR data with federal authorities or out-of-state agencies, thus eliminating a vehicle for discriminatory searches spreading across state lines.
Ultimately, elected officials must terminate Flock Safety contracts entirely. The evidence is now clear: audit logs and internal policies alone cannot prevent a surveillance system from becoming a tool for racist policing. The fundamental architecture of Flock—thousands of cameras feeding into a nationwide searchable network—makes discrimination inevitable when enforcement mechanisms fail.
As Sen. Wyden astutely explained, “local elected officials can best protect their constituents from the inevitable abuses of Flock cameras by removing Flock from their communities.”
Table Overview and Notes
The following table compiles terms used by agencies to describe the reasons for searching the Flock Safety ALPR database. In a small number of cases, we removed additional information such as case numbers, specific incident details, and officers’ names that were present in the reason field.
We removed one agency from the list due to the agency indicating that the word was a person’s name and not a reference to Romani people.
In general, we did not include searches that used the term “Romanian,” although many of those may also be indicative of anti-Roma bias. We also did not include uses of “traveler” or “Traveller” when it did not include a clear ethnic modifier; however, we believe many of those searches are likely relevant.
A text-based version of the spreadsheet is available here.
New documents and court records obtained by EFF show that Texas deputies queried Flock Safety’s surveillance data in an abortion investigation, contradicting the narrative promoted by the company and the Johnson County Sheriff that the woman was “being searched for as a missing person” and that “it was about her safety.”
The new information shows that deputies had initiated a “death investigation” of a “non-viable fetus,” logged evidence of a woman’s self-managed abortion, and consulted prosecutors about possibly charging her.
Johnson County Sheriff Adam King repeatedly denied the automated license plate reader (ALPR) search was related to enforcing Texas’s abortion ban, and Flock Safety called media accounts “false,” “misleading” and “clickbait.” However, according to a sworn affidavit by the lead detective, the case was in fact a death investigation in response to a report of an abortion, and deputies collected documentation of the abortion from the “reporting person,” her alleged romantic partner. The death investigation remained open for weeks, with detectives interviewing the woman and reviewing her text messages about the abortion.
The documents show that the Johnson County District Attorney’s Office informed deputies that “the State could not statutorily charge [her] for taking the pill to cause the abortion or miscarriage of the non-viable fetus.”
An excerpt from the JCSO detective’s sworn affidavit.
The records include previously unreported details about the case that shocked public officials and reproductive justice advocates across the country when it was first reported by 404 Media in May. The case serves as a clear warning sign that when data from ALPRs is shared across state lines, it can put people at risk, including abortion seekers. And, in this case, the use may have run afoul of laws in Washington and Illinois.
A False Narrative Emerges
Last May, 404 Media obtained data revealing the Johnson County Sheriff’s Office conducted a nationwide search of more than 83,000 Flock ALPR cameras, giving the reason in the search log: “had an abortion, search for female.” Both the Sheriff’s Office and Flock Safety have attempted to downplay the search as akin to a search for a missing person, claiming deputies were only looking for the woman to “check on her welfare” and that officers found a large amount of blood at the scene – a claim now contradicted by the responding investigator’s affidavit. Flock Safety went so far as to assert that journalists and advocates covering the story intentionally misrepresented the facts, describing it as “misreporting” and “clickbait-driven.”
As Flock wrote of EFF’s previous commentary on this case (bold in original statement):
Earlier this month, there was purposefully misleading reporting that a Texas police officer with the Johnson County Sheriff’s Office used LPR “to target people seeking reproductive healthcare.” This organization is actively perpetuating narratives that have been proven false, even after the record has been corrected.
According to the Sheriff in Johnson County himself, this claim is unequivocally false.
… No charges were ever filed against the woman and she was never under criminal investigation by Johnson County. She was being searched for as a missing person, not as a suspect of a crime.
That sheriff has since been arrested and indicted on felony counts in an unrelated sexual harassment and whistleblower retaliation case. He has also been charged with aggravated perjury for allegedly lying to a grand jury. EFF filed public records requests with Johnson County to obtain a more definitive account of events.
The newly released incident report and affidavit unequivocally describe the case as a “death investigation” of a “non-viable fetus.” These documents also undermine the claim that the ALPR search was in response to a medical emergency, since, in fact, the abortion had occurred more than two weeks before deputies were called to investigate.
In recent years, anti-abortion advocates and prosecutors have increasingly attempted to use “fetal homicide” and “wrongful death” statutes – originally intended to protect pregnant people from violence – to criminalize abortion and pregnancy loss. These laws, which exist in dozens of states, establish legal personhood of fetuses and can be weaponized against people who end their own pregnancies or experience a miscarriage.
In fact, a new report from Pregnancy Justice found that in just the first two years since the Supreme Court’s decision in Dobbs, prosecutors initiated at least 412 cases charging pregnant people with crimes related to pregnancy, pregnancy loss, or birth–most under child neglect, endangerment, or abuse laws that were never intended to target pregnant people. Nine cases included allegations around individuals’ abortions, such as possession of abortion medication or attempts to obtain an abortion–instances just like this one. The report also highlights how, in many instances, prosecutors use tangentially related criminal charges to punish people for abortion, even when abortion itself is not illegal.
By framing their investigation of a self-administered abortion as a “death investigation” of a “non-viable fetus,” Texas law enforcement was signaling their intent to treat the woman’s self-managed abortion as a potential homicide, even though Texas law does not allow criminal charges to be brought against an individual for self-managing their own abortion.
The Investigator’s Sworn Account
Over two days in April, the woman went through the process of taking medication to induce an abortion. Two weeks later, her partner–who would later be charged with domestic violence against her–reported her to the sheriff’s office.
The documents confirm that the woman was not present at the home when the deputies “responded to the death (Non-viable fetus).” As part of the investigation, officers collected evidence of the self-managed abortion that the man had assembled, including photographs, the FedEx envelope the medication arrived in, and the instructions for self-administering the medication.
Another Johnson County official ran two searches through the ALPR database with the note “had an abortion, search for female,” according to Flock Safety search logs obtained by EFF. The first search, which has not been previously reported, probed 1,295 Flock Safety networks–composed of 17,684 different cameras–going back one week. The second search, which was originally exposed by 404 Media, was expanded to a full month of data across 6,809 networks, including 83,345 cameras. Both searches listed the same case number that appears on the death investigation/incident report obtained by EFF.
After collecting the evidence from the woman’s partner, the investigators say they consulted the district attorney’s office, only to be told they could not press charges against the woman.
An excerpt from the JCSO detective’s sworn affidavit.
Nevertheless, when the subject showed up at the Sheriff’s office a week later, officers were under the impression that she had come “to tell her side of the story about the non-viable fetus.” They interviewed her, inspected text messages about the abortion on her phone, and watched her write a timeline of events.
Only after all that did they learn that she actually wanted to report a violent assault by her partner–the same individual who had called the police to report her abortion. She alleged that less than an hour after the abortion, he choked her, put a gun to her head, and made her beg for her life. The man was ultimately charged in connection with the assault, and the case is ongoing.
This documented account runs completely counter to what law enforcement and Flock have said publicly about the case.
Johnson County Sheriff Adam King told 404 Media: “Her family was worried that she was going to bleed to death, and we were trying to find her to get her to a hospital.” He later told the Dallas Morning News: “We were just trying to check on her welfare and get her to the doctor if needed, or to the hospital.”
The account by the detective on the scene makes no mention of concerned family members or a medical investigator. To the contrary, the affidavit says that they questioned the man as to why he “waited so long to report the incident,” and he responded that he needed to “process the event and call his family attorney.” The ALPR search was recorded 2.5 hours after the initial call came in, as documented in the investigation report.
The Desk Sergeant’s Report—One Month Later
EFF obtained a separate “case supplemental report” written by the sergeant who says he ran the May 9 ALPR searches.
The sergeant was not present at the scene, and his account was written belatedly on June 5, almost a month after the incident and nearly a week after 404 Media had already published the sheriff’s alternative account of the Flock Safety search, kicking off a national controversy. The sheriff’s office provided this sergeant’s report to the Dallas Morning News.
In the report, the sergeant claims that the officers on the ground asked him to start “looking up” the woman due to there being “a large amount of blood” found at the residence—an unsubstantiated claim that is in conflict with the lead investigator’s affidavit. The sergeant repeatedly expresses that the situation was “not making sense.” He claims he was worried that the partner had hurt the woman and her children, so “to check their welfare,” he used TransUnion’s TLO commercial investigative database system to look up her address. Once he identified her vehicle, he ran the plate through the Flock database, returning hits in Dallas.
Two abortion-related searches in the JCSO’s Flock Safety ALPR audit log
The sergeant’s report, filed after the case attracted media attention, notably omits any mention of the abortion at the center of the investigation, although it does note that the caller claimed to have found a fetus. The report does not explain, or even address, why the sergeant used the phrase “had an abortion, search for female” as the official reason for the ALPR searches in the audit log.
It’s also unclear why the sergeant submitted the supplemental report at all, weeks after the incident. By that time, the lead investigator had already filed a sworn affidavit that contradicted the sergeant’s account. For example, the investigator, who was on the scene, does not describe finding any blood or taking blood samples into evidence, only photographs of what the partner believed to be the fetus.
One area where they concur: both reports are clearly marked as a “death investigation.”
Correcting the Record
Since 404 Media first reported on this case, King has perpetuated the false narrative, telling reporters that the woman was never under investigation, that officers had not considered charges against her, and that “it was all about her safety.”
But here are the facts:
The reports that have been released so far describe this as a death investigation.
The lead detective described himself as “working a death investigation… of a non-viable fetus” at the time he interviewed the woman (a week after the ALPR searches).
The detective wrote that they consulted the district attorney’s office about whether they could charge her for “taking the pill to cause the abortion or miscarriage of the non-viable fetus.” They were told they could not.
Investigators collected extensive evidence, including photos and documentation of the abortion, and ran her information through multiple databases. They even reviewed her text messages about the abortion.
The death investigation was open for more than a month.
The death investigation was only marked closed in mid-June, weeks after 404 Media’s article and mere days before the Dallas Morning News published its report, in which the sheriff inaccurately claimed the woman “was not under investigation at any point.”
Flock has promoted this unsupported narrative on its blog and in multimedia appearances. We did not reach out to Flock for comment on this article, as their communications director previously told us the company will not answer our inquiries until we “correct the record and admit to your audience that you purposefully spread misinformation which you know to be untrue” about this case.
Consider the record corrected: It turns out the truth is even more damning than initially reported.
The Aftermath
In the aftermath of the original reporting, government officials began to take action. The networks searched by Johnson County included cameras in Illinois and Washington state, both states where abortion access is protected by law. Since then:
The Illinois Secretary of State has announced his intent to “crack down on unlawful use of license plate reader data,” and urged the state’s Attorney General to investigate the matter.
In California, which also has prohibitions on sharing ALPR data out of state and for abortion-ban enforcement, the legislature cited the case in support of pending legislation to restrict ALPR use.
Ranking Members of the House Oversight Committee and one of its subcommittees launched a formal investigation into Flock’s role in “enabling invasive surveillance practices that threaten the privacy, safety, and civil liberties of women, immigrants, and other vulnerable Americans.”
Senator Ron Wyden secured a commitment from Flock to protect Oregonians’ data from out-of-state immigration and abortion-related queries.
In response to mounting pressure, Flock announced a series of new features supposedly designed to prevent future abuses. These include blocking “impermissible” searches, requiring that all searches include a “reason,” and implementing AI-driven audit alerts to flag suspicious activity. But as we’ve detailed elsewhere, these measures are cosmetic at best—easily circumvented by officers using vague search terms or reusing legitimate case numbers. The fundamental architecture that enabled the abuse remains unchanged.
Meanwhile, as the news continued to harm the company’s sales, Flock CEO Garrett Langley embarked on a press tour to smear reporters and others who had raised alarms about the usage. In an interview with Forbes, he even doubled down and extolled the use of the ALPR in this case.
So when I look at this, I go “this is everything’s working as it should be.” A family was concerned for a family member. They used Flock to help find her, when she could have been unwell. She was physically okay, which is great. But due to the political climate, this was really good clickbait.
Nothing about this is working as it should, but it is working as Flock designed.
The Danger of Unchecked Surveillance
This case reveals the fundamental danger of allowing companies like Flock Safety to build massive, interconnected surveillance networks that can be searched across state lines with minimal oversight. When a single search query can access more than 83,000 cameras spanning almost the entire country, the potential for abuse is staggering, particularly when weaponized against people seeking reproductive healthcare.
The searches in this case may have violated laws in states like Washington and Illinois, where restrictions exist specifically to prevent this kind of surveillance overreach. But those protections mean nothing when a Texas deputy can access cameras in those states with a few keystrokes, without external review that the search is legal and legitimate under local law. In this case, external agencies should have seen the word “abortion” and questioned the search, but the next time an officer is investigating such a case, they may use a more vague or misleading term to justify the search. In fact, it’s possible it has already happened.
ALPRs were marketed to the public as tools to find stolen cars and locate missing persons. Instead, they’ve become a dragnet that allows law enforcement to track anyone, anywhere, for any reason—including investigating people’s healthcare decisions. This case makes clear that neither the companies profiting from this technology nor the agencies deploying it can be trusted to tell the full story about how it’s being used.
States must ban law enforcement from using ALPRs to investigate healthcare decisions and prohibit sharing data across state lines. Local governments may try remedies like reducing the data retention period to minutes instead of weeks or months—but, really, ending their ALPR programs altogether is the strongest way to protect their most vulnerable constituents. Without these safeguards, every license plate scan becomes a potential weapon against a person seeking healthcare.
Two recent statements from the surveillance company—one addressing Illinois privacy violations and another defending the company’s national surveillance network—reveal a troubling pattern: when confronted by evidence of widespread abuse, Flock Safety has blamed users, downplayed harms, and doubled down on the very systems that enabled the violations in the first place.
Flock’s aggressive public relations campaign to salvage its reputation comes as no surprise. Last month, we described how investigative reporting from 404 Media revealed that a sheriff’s office in Texas searched data from more than 83,000 automated license plate reader (ALPR) cameras to track down a woman suspected of self-managing an abortion. (A scenario that may have been avoided, it’s worth noting, had Flock taken action when they were first warned about this threat three years ago).
Flock calls the reporting on the Texas sheriff’s office “purposefully misleading,” claiming the woman was searched for as a missing person at her family’s request rather than for her abortion. But that ignores the core issue: this officer used a nationwide surveillance dragnet (again: over 83,000 cameras) to track someone down, and used her suspected healthcare decisions as a reason to do so. Framing this as concern for her safety plays directly into anti-abortion narratives that depict abortion as dangerous and traumatic in order to justify increased policing, criminalization, control—and, ultimately, surveillance.
As if that weren’t enough, the company has also come under fire for how its ALPR network data is being actively used to assist in mass deportation. Despite U.S. Immigration and Customs Enforcement (ICE) having no formal agreement with Flock Safety, public records revealed “more than 4,000 nation and statewide lookups by local and state police done either at the behest of the federal government or as an ‘informal’ favor to federal law enforcement, or with a potential immigration focus.” The network audit data analyzed by 404 Media exposed an informal data-sharing environment that creates an end-run around oversight and accountability measures: federal agencies can access the surveillance network through local partnerships without the transparency and legal constraints that would apply to direct federal contracts.
Flock Safety is adamant this is “not Flock’s decision,” and by implication, not their fault. Instead, the responsibility lies with each individual local law enforcement agency. In the same breath, they insist that data sharing is essential, loudly claiming credit when the technology is involved in cross-jurisdictional investigations—but failing to show the same attitude when that data-sharing ecosystem is used to terrorize abortion seekers or immigrants.
Flock Safety: The Surveillance Social Network
In growing from a 2017 startup to a $7.5 billion company “serving over 5,000 communities,” Flock gave individual agencies wide latitude to set and regulate their own policies. In effect, this approach offered cheap surveillance technology with minimal restrictions, leaving major decisions and actions in the hands of law enforcement while the company scaled rapidly.
And they have no intention of slowing down. Just this week, Flock launched its Business Network, facilitating unregulated data sharing amongst its private sector security clients. “For years, our law enforcement customers have used the power of a shared network to identify threats, connect cases, and reduce crime. Now, we’re extending that same network effect to the private sector,” Flock Safety’s CEO announced.
The company is building out a new mass surveillance network using the exact template that ended with the company having to retrain thousands of officers in Illinois on how not to break state law—the same template that made it easy for officers to do so in the first place. Flock’s continued integration of disparate surveillance networks across the public and private spheres—despite the harms that have already occurred—is owed in part to the one thing that it’s gotten really good at over the past couple of years: facilitating a surveillance social network.
Employing marketing phrases like “collaboration” and “force multiplier,” Flock encourages as much sharing as possible, going as far as to claim that network effects can significantly improve case closure rates. They cultivate a sense of shared community and purpose among users so they opt into good faith sharing relationships with other law enforcement agencies across the country. But it’s precisely that social layer that creates uncontrollable risk.
The possibility of human workarounds at every level undermines any technical safeguards Flock may claim. Search term blocking relies on officers accurately labeling search intent—a system easily defeated by entering vague reasons like “investigation” or incorrect justifications, made either intentionally or not. And, of course, words like “investigation” or “missing person” can mean virtually anything, offering no value to meaningful oversight of how and for what the system is being used. Moving forward, sheriff’s offices looking to avoid negative press can surveil abortion seekers or immigrants with ease, so long as they use vague, innocuous-sounding reasons.
The same can be said for case number requirements, which depend on manual entry and can easily be circumvented by reusing legitimate case numbers for unauthorized searches. Audit logs only track inputs, not contextual legitimacy. And Flock’s proposed AI-driven audit alerts, which at best flag suspicious activity after searches (and harm) have already occurred, rely on local agencies to self-monitor misuse—despite their demonstrated inability to do so.
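The failure mode described above can be made concrete with a short sketch. The validator below is hypothetical—the blocklist, field names, and case-number format are illustrative only, not Flock’s actual implementation—but it shows why any check that sees only the typed inputs, which is all an audit log records, provides no real oversight:

```python
import re

# Hypothetical blocklist and case-number format -- illustrative only,
# not Flock's actual rules.
BLOCKED_TERMS = {"abortion", "immigration", "protest"}
CASE_NUMBER_RE = re.compile(r"^\d{4}-\d{5}$")

def search_allowed(reason: str, case_number: str) -> bool:
    """Input-only validation: checks what the officer typed, not why."""
    words = set(re.findall(r"[a-z]+", reason.lower()))
    if words & BLOCKED_TERMS:
        return False
    return bool(CASE_NUMBER_RE.match(case_number))

# The explicitly labeled query is blocked...
assert not search_allowed("had an abortion, search for female", "2025-00142")

# ...but the identical underlying search passes with a vague reason and a
# reused, perfectly valid case number.
assert search_allowed("investigation", "2025-00142")
assert search_allowed("missing person", "2025-00142")
```

Because the check never sees the searcher’s intent or the case file behind the number, the blocked query and the approved one can be the very same search under different labels—which is exactly the circumvention that input-only audit logs cannot detect.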
And, of course, even the most restrictive department policy may not be enough. Austin, Texas, had implemented one of the most restrictive ALPR programs in the country, and the program still failed: the city’s own audit revealed systematic compliance failures that rendered its guardrails meaningless. The company’s continued appeal to “local policies” means nothing when Flock’s data-sharing network does not account for how law enforcement policies, regulations, and accountability vary by jurisdiction. You may have a good relationship with your local police, who solicit your input on what their policy looks like; you don’t have that same relationship with hundreds or thousands of other agencies with whom they share their data. So if an officer on the other side of the country violates your privacy, it’d be difficult to hold them accountable.
ALPR surveillance systems are inherently vulnerable to both technical exploitation and human manipulation. These vulnerabilities are not theoretical—they represent real pathways for bad actors to access vast databases containing millions of Americans’ location data. When surveillance databases are breached, the consequences extend far beyond typical data theft—this information can be used to harass, stalk, or even extort. The intimate details of people’s daily routines, their associations, and their political activities may become available to anyone with malicious intent. Flock operates as a single point of failure that can compromise—and has compromised—the privacy of millions of Americans simultaneously.
Don’t Stop de-Flocking
Rather than addressing legitimate concerns about privacy, security, and constitutional rights, Flock has only promised updates that fall short of meaningful reforms. These software tweaks and feature rollouts cannot assuage the fear engendered by the massive surveillance system it has built and continues to expand.
Flock’s insistence that what’s happening with abortion criminalization and immigration enforcement has nothing to do with them—that these are just red-state problems or the fault of rogue officers—is concerning. Flock designed the network that is being used, and the public should hold them accountable for failing to build in protections from abuse that cannot be easily circumvented.
Thankfully, that’s exactly what’s happening: cities like Austin, San Marcos, Denver, Norfolk, and San Diego are pushing back. And it’s not nearly as hard a choice as Flock would have you believe: Austinites are weighing the benefits of a surveillance system that generates a hit less than 0.02% of the time against the possibility that scanning 75 million license plates will result in an abortion seeker being tracked down by police, or an immigrant being flagged by ICE in a so-called “sanctuary city.” These are not hypothetical risks. It is already happening.
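Taking the figures above at face value, the arithmetic behind that trade-off is stark. A quick back-of-the-envelope calculation (using 75 million scans and 0.02% as an upper bound on the hit rate, both as cited above):

```python
scans = 75_000_000   # license plates scanned, per the figures cited above
hit_rate = 0.0002    # "a hit less than 0.02% of the time", as an upper bound

max_hits = round(scans * hit_rate)
non_hits = scans - max_hits
print(f"At most {max_hits:,} hits out of {scans:,} scans")
print(f"{non_hits:,} scans recorded vehicles tied to no alert at all")
```

In other words, at most roughly 15,000 scans produced a hit, while some 75 million recorded the movements of people suspected of nothing.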
Given how pervasive, sprawling, and ungovernable ALPR sharing networks have become, the only feature update we can truly rely on to protect people’s rights and safety is no network at all. And we applaud the communities taking decisive action to dismantle this surveillance infrastructure.
Here’s yet another worrying development in the world of privately-owned security cameras. Flock Safety has made aggressive inroads in both the private and public sector, something aided greatly by the company’s ability to blend the two.
Much like Ring before it, Flock is pitching cheap cameras with local law enforcement buy-in, nudging residents towards leaving their cameras (some of which have license plate reader capabilities) open so law enforcement can search their plate captures without a warrant. Law enforcement agencies are also buying their own cameras to ensure people can’t travel very far without leaving at least a temporary record of their travels the government can access pretty much at will.
And this is how that meshing of public-private is playing out in real life. As Joseph Cox and Jason Koebler report for 404 Media, at least one law enforcement officer has used this meshed network of Flock ALPR cameras to help locate a woman who recently had an abortion.
On May 9, an officer from the Johnson County Sheriff’s Office in Texas searched Flock cameras and gave the reason as “had an abortion, search for female,” according to the multiple sets of data. Whenever officers search Flock cameras they are required to provide a reason for doing so, but generally do not require a warrant or any sort of court order. Flock cameras continually scan the plates, color, and model of any vehicle driving by, building a detailed database of vehicles and by extension peoples’ movements.
Cops are able to search cameras acquired in their own district, those in their state, or those in a nationwide network of Flock cameras. That single search for the woman spread across 6,809 different Flock networks, with a total of 83,345 cameras, according to the data. The officer looked for hits over a month long period, it shows.
Some of these cameras were likely owned and operated by private purchasers. But even with those excluded, it’s still a massive data set the government can access without having to offer up much in the way of justification. The justification here (one that was reflected in access audits from Flock systems located as far away as Washington state) seems especially ominous and especially flimsy: “had an abortion, search for female.”
The Johnson County Sheriff’s Office claims this search was performed to help, not harm.
Sheriff Adam King of the Johnson County Sheriff’s Office told 404 Media in a phone call that the woman self-administered the abortion “and her family was worried that she was going to bleed to death, and we were trying to find her to get her to a hospital.”
“We weren’t trying to block her from leaving the state or whatever to get an abortion,” he said. “It was about her safety.”
Even if that’s completely true, it’s not that comforting to know Texas law enforcement officers can perform the same searches for the purpose of prosecuting people who have sought abortions in nearby states where this is still legal. The justifications offered during the acquisition process always stress that the equipment will be used to deal with the most violent crimes. While utilizing the tech to search for a missing person is something most people would find acceptable, its proximity to the state’s recent abortion ban definitely isn’t an encouraging sign.
If these tools can be used this way, you can guarantee they will be used this way. Once one law enforcement agency gets the ball rolling on abortion arrests and weathers the press storm that it will provoke, the rest will follow suit, especially in areas populated by prosecutors with anti-abortion beliefs. Companies like Flock will just make everything easier for people looking to punish women for daring to explore their options and retain what’s left of their bodily autonomy.
When your local police department buys one piece of surveillance equipment, you can easily expect that the company that sold it will try to upsell them on additional tools and upgrades.
At the end of the day, public safety vendors are tech companies, and their representatives are salespeople using all the tricks from the marketing playbook. But these companies aren’t just after public money—they also want data.
And each new bit of data that police collect contributes to a pool of information to which the company can attach other services: storage, data processing, cross-referencing tools, inter-agency networking, and AI analysis. The companies may even want the data to train their own AI model. The landscape of the police tech industry is changing, and companies that once specialized in a single technology (such as hardware products like automated license plate readers (ALPRs) or gunshot detection sensors) have developed new capabilities or bought up other tech companies and law enforcement data brokers—all in service of becoming the corporate giant that serves as a one-stop shop for police surveillance needs.
One of the most alarming trends in policing is that companies are regularly pushing police to buy more than they need. Vendors regularly pressure police departments to lock in the price now for a whole bundle of features and tools in the name of “cost savings,” often claiming that the cost à la carte for any of these tools will be higher than the cost of a package, which they warn will also be priced more expensively in the future. Market analysts have touted the benefits of creating “moats” between these surveillance ecosystems and any possible competitors. By making it harder to switch service providers due to integrated features, these companies can lock their cop customers into multi-year subscriptions and long-term dependence.
Think your local police are just getting body-worn cameras (BWCs) to help with public trust or ALPRs to aid their hunt for stolen vehicles? Don’t assume that’s the end of it. If there’s already a relationship between a company and a department, that department is much more likely to get access to a free trial of whatever other device or software that company hopes the department will put on its shopping list.
These vendors also regularly help police departments apply for grants and waivers, and provide other assistance to find funding, so that as soon as there’s money available for a public safety initiative, those funds can make their way directly to their business.
Companies like Axon have been particularly successful at leveraging their existing relationships, and their ability to bundle equipment, to win “sole source” designations. Typically, government agencies must conduct a bidding process when buying a new product, be it toilet paper, computers, or vehicles. For a company to be designated a sole-source provider, it is supposed to provide a product that no other vendor can provide. If a company can get this designation, it can essentially eliminate any possible competition for particular government contracts. When Axon is under consideration as a vendor for equipment like BWCs, for which there are multiple possible other providers, it’s not uncommon to see a police department arguing for a sole-source procurement for Axon BWCs based on the company’s ability to directly connect their cameras to the Fusus system, another Axon product.
Here are a few of the big players positioning themselves to collect your movements, analyze your actions, and make you—the taxpayer—bear the cost for the whole bundle of privacy invasions.
Axon Enterprise’s ‘Suite’
Axon expects to have yet another year of $2 billion-plus in revenue in 2025. The company first got its hooks into police departments through the Taser, the electric stun gun. Axon then plunged into the BWC market amidst Obama-era outrage at police brutality and the flood of grant money flowing from the federal government to local police departments for BWCs, which were widely promoted as a police accountability tool. Axon parlayed its relationships with hundreds of police departments, and its capture and storage of growing terabytes of police footage, into a menu of new technological offerings.
In its annual year-end securities filing, Axon told investors it was “building the public safety operating system of the future” through its suite of “cloud-hosted digital evidence management solutions, productivity and real-time operations software, body cameras, in-car cameras, TASER energy devices, robotic security and training solutions” to cater to agencies in the federal, corrections, justice, and security sectors.
Axon controls an estimated 85 percent of the police body-worn camera market. Its Evidence.com platform, once a trial add-on for BWC customers, is now also one of the biggest records management systems used by police. Its other tools and services include record management, video storage in the cloud, drones, connected private cameras, analysis tools, virtual reality training, and real-time crime centers.
An image from the Quarter 4 2024 slide deck for investors, which describes different levels of the “Officer Safety Plan” (OSP) product package and highlights how 95% of Axon customers are tied to a subscription plan.
Axon has been adding AI to its repertoire, and it now features a whole “AI Era” bundle plan. One recent offering is Draft One, which connects to Axon’s body-worn cameras (BWCs) and uses AI to generate police reports based on the audio captured in the BWC footage. While use of the tool may start off as a free trial, Axon sees Draft One as another key product for capturing new customers, despite widespread skepticism of the accuracy of the reports, the inability to determine which reports have been drafted using the system, and the liability they could bring to prosecutions.
In 2024, Axon acquired a company called Fusus, a platform that combines the growing stores of data that police departments collect—notifications from gunshot detection and automated license plate reader (ALPR) systems; footage from BWCs, drones, public cameras, and sometimes private cameras; and dispatch information—to create “real-time crime centers.” The company now claims that Fusus is being used by more than 250 different policing agencies.
Fusus claims to bring the power of the real-time crime center to police departments of all sizes, which includes the ability to help police access and use live footage from both public and private cameras through an add-on service that requires a recurring subscription. It also claims to integrate nicely with surveillance tools from other providers. Recently, it has been cutting ties, most notably with Flock Safety, as it starts to envelop some of the options its frenemies had offered.
In the middle of April, Axon announced that it would begin offering fixed ALPR, a key feature of the Flock Safety catalogue, and an AI Assistant, which has been a core offering of Truleo, another Axon competitor.
Flock Safety’s Bundles and FlockOS
Flock Safety is another major police technology company that has expanded its focus from one primary technology to a whole package of equipment and software services.
Flock Safety started with ALPRs. These tools use a camera to read vehicle license plates, collecting the make, model, location, and other details which can be used for what Flock calls “Vehicle Fingerprinting.” The details are stored in a database that sometimes finds a match among a “hot list” provided by police officers, but otherwise just stores and shares data on how, where, and when everyone is driving and parking their vehicles.
Much of what Flock Safety does now comes together in their FlockOS system, which claims to bring together various surveillance feeds and facilitate real-time “situational awareness.”
Motorola Solutions’ ‘Ecosystem’
When you think of Motorola, you may think of phones—but there’s a good chance that you missed the moment in 2011 when the phone side of the company, Motorola Mobility, split off from Motorola Solutions, which is now a big player in police surveillance.
On its website, Motorola Solutions claims that departments are better off using a whole list of equipment from the same ecosystem, boasting the tagline, “Technology that’s exponentially more powerful, together.” Motorola describes this as an “ecosystem of safety and security technologies” in its securities filings. In 2024, the company also reported $2 billion in sales, but unlike Axon, its customer base is not exclusively law enforcement and includes private entities like sports stadiums, schools, and hospitals.
Motorola’s technology includes 911 services, radio, BWCs, in-car cameras, ALPRs, drones, face recognition, crime mapping, and software that supposedly unifies it all. Notably, video can also come with artificial intelligence analysis, in some cases allowing law enforcement to search video and track individuals across cameras.
In January 2019, Motorola Solutions acquired Vigilant Solutions, one of the big players in the ALPR market, as part of its takeover of VaaS International Holdings. Now the company (under the subsidiary DRN Data) claims to have billions of scans saved from police departments and private ALPR cameras around the country. Marketing language for its Vehicle Manager system highlights that “data is overwhelming,” because the amount of data being collected is “a lot.” It’s a similar claim made by other companies: Now that you’ve bought so many surveillance tools to collect so much data, you’re finding that it is too much data, so you now need more surveillance tools to organize and make sense of it.
SoundThinking’s ‘SafetySmart Platform’
SoundThinking began as ShotSpotter, a so-called gunshot detection tool that uses microphones placed around a city to identify and locate sounds of gunshots. As news reports of the tool’s inaccuracy and criticisms have grown, the company has rebranded as SoundThinking, adding to its offerings ALPRs, case management, and weapons detection. The company is now marketing its SafetySmart platform, which claims to integrate different stores of data and apply AI analytics.
In 2024, SoundThinking laid out its whole scheme in its annual report, referring to it as the “cross-sell” component of their sales strategy.
The “cross-sell” component of our strategy is designed to leverage our established relationships and understanding of the customer environs by introducing other capabilities on the SafetySmart platform that can solve other customer challenges. We are in the early stages of the upsell/cross-sell strategy, but it is promising – particularly around bundled sales such as ShotSpotter + ResourceRouter and CaseBuilder + CrimeTracer. Newport News, VA, Rocky Mount, NC, Reno, NV and others have embraced this strategy and recognized the value of utilizing multiple SafetySmart products to manage the entire life cycle of gun crime…. We will seek to drive more of this sales activity as it not only enhances our system’s effectiveness but also deepens our penetration within existing customer relationships and is a proof point that our solutions are essential for creating comprehensive public safety outcomes. Importantly, this strategy also increases the average revenue per customer and makes our customer relationships even stickier.
Many of SoundThinking’s new tools rely on a push toward “data integration” and artificial intelligence. ALPRs can be integrated with ShotSpotter. ShotSpotter can be integrated with the CaseBuilder records management system, and CaseBuilder can be integrated with CrimeTracer. CrimeTracer, once known as COPLINK X, is a platform that SoundThinking describes as a “powerful law enforcement search engine and information platform that enables law enforcement to search data from agencies across the U.S.” EFF tracks this type of tool in the Atlas of Surveillance as a third-party investigative platform: software tools that combine open-source intelligence data, police records, and other data sources, including even those found on the dark web, to generate leads or conduct analyses.
SoundThinking, like a lot of surveillance, can be costly for departments, but the company seems to see the value in fostering its existing police department relationships even if they’re not getting paid right now. In Baton Rouge, budget cuts recently resulted in the elimination of the $400,000 annual contract for ShotSpotter, but the city continues to use it.
“They have agreed to continue that service without accepting any money from us for now, while we look for possible other funding sources. It was a decision that it’s extremely expensive and kind of cost-prohibitive to move the sensors to other parts of the city,” Baton Rouge Police Department Chief Thomas Morse told a local news outlet, WBRZ.
Beware the Bundle
Government surveillance is big business. The companies that provide surveillance and police data tools know that it’s lucrative to cultivate police departments as loyal customers. They’re jockeying for monopolization of the state surveillance market that they’re helping to build. While they may be marketing public safety in their pitches for products, from ALPRs to records management to investigatory analysis to AI everything, these companies are mostly beholden to their shareholders and bottom lines.
The next time you come across BWCs or another piece of tech on your city council’s agenda or police department’s budget, take a closer look to see what other strings and surveillance tools might be attached. You are not just looking at one line item on the sheet—it’s probably an ongoing subscription to a whole package of equipment designed to challenge your privacy, and no sort of discount makes that a price worth paying.
Late last fall, a number of Norfolk, Virginia residents — with the assistance of the Institute for Justice (IJ) — sued the city for blanketing Norfolk with nearly 200 automatic license plate readers (ALPRs) provided by Flock Safety.
Flock Safety made its first inroads with the private market, selling plate readers to gated communities and HOAs so busybodies could keep track of everyone driving in and out of their cul-de-sacs. Having captured that market, Flock moved on, targeting US law enforcement agencies with the promise of cheap ALPRs that could be tied into existing ALPR cameras deployed by private citizens.
It’s pretty much the Ring playbook — aggressive market growth that gives cops cheap buy-in so long as they sign long-term contracts to access images and footage. And it’s the same scheme: the implication that using consumer-oriented products will give cops instant access to a further network of privately-owned cameras.
The Flock that became a swarm scored a small win in court before this lawsuit was filed. A state judge ruled three hits from a private company’s plate reader weren’t quite enough to trigger a Fourth Amendment violation.
The IJ and its clients disagree. The lawsuit noted the city was now infested with cameras, something the police chief himself said “creates a nice curtain of technology.” “Curtain” is pretty much a blanket when it comes to fabric-based analogies. Police chief Mark Talbot also said “It would be difficult to drive anywhere of any distance without running into a camera.” That certainly sounds like a dragnet.
On top of that, records obtained from the Norfolk PD showed there was no direct or indirect oversight of officers’ access to ALPR data, which not only included plate/location data but also descriptive information about vehicles that investigators could use as search terms, rather than just the plate number itself.
And that’s how the city found itself getting sued by residents represented by the Institute for Justice. Less than five months after filing this suit, a federal judge has ruled this case can move forward.
As the decision [PDF] points out, there’s no denying this carpeting (fabric again!) of the city with cameras creates an inescapable network of government surveillance. And that sort of thing has been addressed by the Supreme Court, as well as courts at the appellate level.
Controlling precedent has deemed certain law enforcement surveillance methods as tantamount to a drag-net, finding that these technologies violate individuals’ subjective and reasonable objective expectations of privacy and therefore constitute a Fourth Amendment search. For example, the Supreme Court held that gathering all of an individual’s cell-site location information over a period of as little as seven days was a search, for it revealed the whole of an individual’s physical movements during that period. Carpenter, 585 U.S. at 311. Similarly, the Fourth Circuit held that aerial surveillance by plane, which captured second-by-second images of broad swaths of the City of Baltimore for close to 12 hours a day, sufficiently tracked the whole of one’s movements and was therefore a search.
That makes it clear the government can’t simply claim public movements have no expectation of privacy. They likely don’t in the singular, but the aggregate is what’s problematic in terms of constitutionality.
This court says the long-term tracking of people’s movements (via plate/location data from a network of cameras that are, as the police chief stated, inescapable) is the sort of thing the Carpenter decision addressed, even if it dealt with a different form of long-term tracking.
Relying on Carpenter, when this Court accepts Plaintiffs’ well-pled version of the facts and draws all reasonable inferences in their favor, as is required at this stage of the proceedings, the Court concludes that it is plausible that Plaintiffs subjectively believe they have a reasonable expectation of privacy that is being violated because the Flock camera system is creating a drag-net system of surveillance that effectively tracks the whole of Plaintiffs’ physical movements.
Given this, it’s unlikely the government can successfully argue that if Norfolk residents don’t want to be tracked by ALPRs, they can simply choose to walk or use public transportation. That argument hasn’t worked in multiple Supreme Court decisions where the government has claimed that if people don’t want their phone location data accessed without a warrant by investigators, they should just leave their phones at home. In this day and age, going without a phone is about as impractical as going without a car. These are essentials of everyday life, even in cities with marvelous public transportation systems. And I doubt Norfolk places highly on the list of “Best Public Transportation Systems.”
As one plaintiff notes, he can’t even leave his own neighborhood without being photographed by up to four Flock ALPR cameras. That’s ridiculous. And, as this court has ruled at this point, it’s also possibly unconstitutional.
When construed in Plaintiff’s favor, as required at this stage of the case, the complaint alleges facts notably similar to those in Carpenter that the Supreme Court found to clearly violate society’s expectation of privacy: law enforcement secretly monitoring and cataloguing the whole of tens of thousands of individuals’ movements over an extended period. In short, the Court finds that considering existing precedent, the well-pled facts plausibly allege a violation of an objectively reasonable expectation of privacy.
Now, of course, this doesn’t mean the plaintiffs have won. But the important thing at this point in the litigation is that the government hasn’t won. The suit has not been dismissed. The fight continues. And, given the tone of this decision, it appears the government will need to bring some new arguments to its defense of its ALPR dragnet because the usual stuff is foreclosed by precedent. With any luck, the Institute for Justice and its clients will, at the very least, generate a warrant requirement for access to ALPR databases — something that would be the first of its kind in this nation.
This lawsuit could not be more impeccably timed. Whether or not this timing is more fortuitous than impeccable remains to be seen, but there’s no denying the bang-bang-bang effect on display here, even if it may just be coincidental.
Last week, a Virginia federal court ruled three hits from Flock ALPR cameras weren’t enough to trigger a Fourth Amendment violation. It reasoned this was not the same sort of post facto long-term tracking addressed by the Supreme Court’s Carpenter decision, which mainly dealt with law enforcement’s obtaining massive amounts of cell site location data from service providers without a warrant.
That decision erected a warrant requirement for obtaining this data from service providers. The limited holding said important things about tech tools lending themselves to pervasive surveillance while evading the guardrails of Fourth Amendment jurisprudence, but had little to say about slightly less pervasive surveillance using other systems that weren’t reliant on cell service providers and their location data.
The end result was a loss for the defendant, who failed to show three hits from Flock ALPR (automatic license plate reader) cameras violated his rights. The court arrived at this conclusion despite noting Flock’s cameras captured far more than just plate/location/time data. The cameras also captured distinguishing features of vehicles passing its camera and allowed law enforcement to search by make/model/distinguishing features, rather than limiting them to plate number searches. (It also noted Flock cameras gather lots of images of people, but Flock affirmed to the court that its software could not be used to perform facial recognition or otherwise track people’s movements outside of their cars. For now, anyway…)
A week later, another city in Virginia is being sued for its network of Flock ALPR cameras. Unlike what was seen in Richmond, Virginia, where a court ruled a few hits from a reverse search of a vehicle’s description didn’t raise constitutional concerns, the flock of Flocks in Norfolk is a bit more concerning. Even the city’s top cop has admitted it’s a panopticon enabled by easy-to-use and even easier to monitor cameras provided by Flock. (h/t FourthAmendment.com)
Norfolk police chief Mark Talbot said last year, “It would be difficult to drive anywhere of any distance without running into a camera.”
But there’s more to it than just a network of ALPR cameras. Most ALPR systems are designed to notify law enforcement when plates on alert lists pass a camera. It’s a passive system that doesn’t require nor allow constant monitoring by law enforcement. The system in use in Norfolk is the opposite. While it can be utilized as a passive system that provides alerts for plate hits, the Norfolk PD has decided to use it as an active monitoring system to track people’s movements. This is from the lawsuit [PDF], filed with the assistance of the Institute for Justice:
There are no meaningful restrictions on City officers’ access to this information. Officers need only watch Flock’s orientation video and create login credentials to get access. After that, the police department requires them to log in and use Flock’s database throughout their entire shift. Although the police department’s policy requires that officers use the information for law enforcement purposes only, no one proactively monitors their use. Every City officer can search the database whenever they want for whatever they want—no need to seek advance approval.
All of this is done without a warrant. No officer ever has to establish probable cause, swear to the facts in a warrant application, and await the approval of a neutral judge. The cameras take photographs and store the information of every driver that passes them—suspect or not. The photographs and information are then available to any officer in the City to use as they see fit, for the next 30 days. And if City officials download the photos and information during that 30-day window, there are no meaningful restraints on how long they can hold them or how they may be used.
Worse still, Flock maintains a centralized database with over one billion license plate reads every month. So, even after a driver leaves the City, officers can potentially keep following them in the more than 5,000 communities where Flock currently has cameras. Likewise, any person with access to Flock’s centralized database can access the City’s information, potentially without the City even knowing about it. Ominously, the City’s police chief has said this “creates a nice curtain of technology” for the City and surrounding area.
The Fourth Amendment concerns might be the least of the issues here. This is a massive database of people’s movements being constantly refilled by the city’s 172 cameras, operated by officers who are expected to actively engage with the system, all without any credible or meaningful oversight.
The city and PD tout it as a crime fighting tool. But it’s a system that actively encourages abuse. Since it tracks every vehicle, officers can use it to track the movements of anyone they wish to track, ranging from journalists to protestors to estranged spouses to anyone they might feel like knowing more about for definitely non-law enforcement purposes. This isn’t mere speculation. The lawsuit cites a couple of past abuses of law enforcement databases by police officers. We’ve covered several more of these cases here at BestNetTech over the past decade.
The argument here is that warrantless surveillance of people’s movements violates the Fourth Amendment, even if any single plate/vehicle photo is not a violation in and of itself. Whether this argument will generate favorable judicial action depends on whether or not the court decides to view the hundreds of thousands of photos captured by the 172 cameras as a whole, or whether it decides each of these hundreds of thousands is its own individual observation of a vehicle on a public road, which has never been considered a Fourth Amendment violation.
And that’s where Carpenter will come into play. The Supreme Court knew any individual data point meant nothing, especially under the Third Party Doctrine. But when combined to create a long-term record of someone’s movements, officers needed probable cause to obtain this data. In this case, the court will have to decide whether accessing this database requires a warrant.
Whatever it decides, it’s clear the Norfolk PD needs to be doing more to prevent abusive access by officers. The policies it has in place do absolutely nothing to deter misuse. The only requirement appears to be agreeing to some click-wrap that happens to include a short video officers are only obligated to press “play” on. After that, the use and application of this tech appears to be left to each officer’s discretion, which is absolutely the best way to encourage multiple indiscretions. If this lawsuit can’t actually get a warrant requirement installed, perhaps it will, at the very least, force the PD to more closely supervise officers’ use of the Flock-enabled Norfolk Panopticon.
The Supreme Court’s Carpenter decision created a warrant requirement for obtaining location data from service providers. It was a limited ruling, albeit one that has had far-ranging implications.
Thanks to this ruling, law enforcement agencies have started buying location data from third-party brokers, rather than suffer the apparent indignity of having to ask a judge to approve a warrant. The underlying theme of the ruling — that the Fourth Amendment ain’t what it used to be now that everyone’s online all the time — has seen it applied to cases where location data isn’t the underlying concern. Anything law enforcement might use to engage in tracking of individuals is now under additional scrutiny.
And while this is mostly a welcome development, it also means there’s still a lot of unsettled law that isn’t going to get settled quickly, because the confines of the ruling and the limited upheaval of the Third Party Doctrine are often in conflict.
But some cases are easier than others, even if they’re actually still a bit complicated. A recent ruling handed down by a Virginia federal court applies Carpenter to automated license plate readers. It’s not the first to do so but it might be the first that involves Flock Safety, a manufacturer of license plate readers that’s equally willing to sell to both private and government entities.
Flock first made a name by hawking its wares to gated communities and HOAs, preying on their desire to keep outsiders out and insiders subject to the restraints of their HOA agreements. Then it realized an unlimited revenue stream existed in the public sector and started courting cops. And then it realized it could leverage both of its markets by encouraging the private sector to make its ALPR shots available to cops without the friction of subpoenas or warrants.
And that’s why this decision [PDF], coming to us via FourthAmendment.com, is especially interesting. It’s not all that interesting that a court might decide the use of three ALPR/vehicle photos obtained from Flock ALPR cameras might not meet the “long-term location tracking” standard loosely applied by the Carpenter decision. What’s far more interesting about this decision is its discussion of how Flock and its network of cameras operate, especially when assisting law enforcement during investigations. It also highlights aspects of Flock camera operation that users and non-users alike may not have been aware of.
Here’s how this worked out for robbery suspect Kumiko Martin Jr., who sought to have the Flock ALPR images obtained by investigators suppressed:
The Court is cautious to not hinder law enforcement’s use of modernizing surveillance capabilities in the public sphere lest the Court “embarrass the future.” Carpenter, 585 U.S. at 316 (quoting Nw. Airlines, Inc. v. Minnesota, 322 U.S. 292, 300 (1944) (internal quotation marks omitted)). This Court must rule on the facts as they are and may not speculate about what the future may hold for Flock’s capabilities. Today’s ruling is limited to the facts of this case as they are at the time of this ruling, including the limited number of Flock cameras in the Richmond area and the limited number of pictures taken of the exterior of Martin’s vehicle. Accessing Flock’s database, which captured only three photographs of Martin’s vehicle during the relevant 30-day period, did not allow law enforcement to track or monitor the “whole of [Martin’s] physical movements,” and therefore was not a search under the Fourth Amendment.
As the court states, this finding is limited to these facts. Future cases involving the same tech may alter the Fourth Amendment equilibrium, but for now, applying Carpenter doesn’t turn three images into long-term warrantless tracking.
But there’s a lot more information in the decision, which makes the entire thing worth reading. The first thing worth noting is that Flock’s ALPR offerings don’t limit themselves to identifying license plates. The tech does what it can to flag anything else distinctive about the vehicle and add that to its searchable database.
Unlike ALPRs, photographs by Flock cameras are uploaded in full to a Cloud database that records and stores the captured data. This searchable data includes the photograph’s date, time, and location as well as the vehicle’s license plate (and absence, temporariness, or obstruction thereof), the plate’s state- and/or country-of-origin, body type, make, model, color, and other “unique identifiers” such as visible toolboxes, bumper and window stickers, roof racks, and damage to the exterior of the vehicle. Flock updates the software to provide additional metrics for use in querying and reviewing the database.
In this case, the Flock camera was mounted on a pole. This camera cannot be remotely controlled, so its operation was completely passive. Even so, its limited functions still provided more info than most ALPRs currently deployed solely by law enforcement agencies.
And while most ALPR cameras will capture at least some of a vehicle’s exterior, searchable databases of plate records tend to be limited to plate/location/time/date info. Flock’s software, however, basically allows investigators to perform reverse image searches of almost any distinguishing feature in hopes of finding hits based on vehicle description, rather than license plate numbers.
[W]hen Officer Redford queried the Flock database for vehicles that matched the description of the Dunston Robbery suspect’s car obtained from the Valero cameras, the Flock system limited its results to the 30 days preceding April 22, 2023.
This time limit is not specific to Flock. Retention periods can be indefinite. But ALPR data retention is limited by law in Virginia to 30 days, so Flock’s software applied this limit to provide the investigator with search results within the statutory limit. Nothing stated in this decision (or by Flock itself) suggests Flock doesn’t retain these records for a longer period, no matter what limitations might be placed on its government customers.
Officer Redford’s query returned 2,500 results (photographs), which is the maximum that Flock’s system shows per search. Officer Redford then manually reviewed those 2,500 pictures and found two of the suspect’s vehicle, which Officer Redford was able to identify based on the unique stickers. Unlike the Valero security-camera footage, the Flock pictures also identified the vehicle as gold with a Virginia license-plate number of UAL-6525.
Given this option, it’s unsurprising law enforcement agencies are the fastest-growing market for Flock ALPR readers. This makes regular ALPRs look positively anemic in terms of technical capability. Unfortunately, it also encourages officers to do things that might be constitutionally unwise, like working backwards from a vague vehicle description to a supposed suspect by performing underinformed and ill-advised searches. It’s the same thing we’ve seen gone wrong with facial recognition tech. Now, we’re not only involving possibly innocent vehicles, but the drivers of those vehicles who will be treated as dangerous criminal suspects just because the software applied the wrong tag or an investigator leapt to conclusions not actually supported by the search results.
To be fair to Flock, it’s at least trying to curb some of the collateral damage.
The cameras are not designed to capture pictures of humans but may do so incidentally. If that occurs, however, the database does not allow searches based on biometric or other human-based characteristics that would allow law enforcement to scan for individuals.
But that’s not exactly good news. The capability to utilize these ALPRs for facial recognition is already at least partly there. At some point, Flock is going to decide it’s worth the PR hit to turn its cameras into ALPR/facial recognition hybrids, instantly doubling the issues created by its system’s ability to flag cars by description, which is far less distinct than limiting search results to license plate numbers.
Also concerning: the “share” option built into privately-owned Flock cameras doesn’t limit itself to local law enforcement agencies. Agreeing to share image captures with law enforcement means officers anywhere in the country can access data or recordings made by these cameras. This decision notes that the officer had access to 188 public and private cameras in the surrounding three counties alone.
Given what’s been put on the record about Flock cameras and their capabilities, there may be a solid case in the future that might invoke Carpenter’s warrant requirement. Unfortunately, there’s a much larger hurdle to be cleared here. Cameras only capture images of cars on public roads (at least in most cases). They do not track people’s movements outside of their vehicles. So, it’s unlikely any court will find this sort of post facto “tracking” a violation of the Fourth Amendment. But the added capabilities — and the government’s reliance on privately-owned cameras when performing searches for vehicle movements — might pose an issue later on when more precedent involving unforeseen technological developments is generated. For now, at least we know a bit more about how Flock operates and how investigators are using it to search for cars, rather than plate numbers. It’s informative, to be sure, but it’s not going to help this suspect avoid a trial on robbery charges.