“Flock Safety” may be the brand name, but this company’s earliest sales successes had nothing to do with safety. Its target audience was homeowners associations and people running gated communities in upscale neighborhoods. The purpose of the cameras (and, eventually, the attached license plate reader tech) was to make sure people who were plenty safe already weren’t annoyed by occasional intrusions from the rest of the world outside their gates.
Then it went the Ring route, offering cheap cameras to cops. It was just inkjet printers all over again. The cameras were affordable. Subscription fees for access to footage and the company’s search engine were the real moneymaker.
And, much like Ring, Flock has ended up on the wrong side of public opinion. While it hasn’t quite generated the amount of negative press Ring’s cozy relationship with cop shops has (yet!), it’s been getting eyeballed pretty fiercely by people who aren’t fans of its access-it-all-from-anywhere attitude. A report from 404 Media showed Texas law enforcement officers using the nationwide network of Flock ALPR data to hunt down someone who had engaged in a medication abortion. Weeks later, it was discovered the search was performed on behalf of the woman’s vengeful boyfriend, who sought to press criminal charges against her.
Other news has surfaced as well, making Flock Safety look even worse. It has placed almost no restrictions on access by anyone from anywhere, which has resulted in a lot of local law enforcement agencies performing searches that federal agencies like CBP, US Border Patrol, and ICE can’t perform themselves. In some cases, Flock’s lack of restraint and nonexistent privacy policies have made its cameras pretty much illegal. In other cases, local lawmakers are finally reining in use of this camera network due to its steady abuse by federal officers.
Police departments in Redmond and Lynnwood have temporarily shut down their Flock license plate reader systems following growing public concerns about privacy and system access, according to city officials.
Redmond’s City Council voted unanimously Monday to turn off its Automated License Plate Reader (ALPR) cameras after learning that U.S. Border Patrol improperly accessed Auburn’s Flock system last month.
Redmond’s police chief, Darrell Lowe, insists no improper/proxy access has happened on his watch. But that doesn’t mean all that much, because it’s unclear whether Flock Safety would even inform local cops if federal agencies had accessed their systems. For that matter, federal agencies running proxy searches generally get access to records generated anywhere in the country. So it’s hardly comforting to assure people your agency hasn’t been approached directly by federal officers.
That was the point Senator Ron Wyden made in his letter to Flock Safety — one in which he pointed out that Flock has zero desire to deter abuse of its camera network, much less engage in good faith discussions about how it could go about siloing its networks so searches are restricted to areas directly overseen by local law enforcement.
The police chief in Lynnwood, however, didn’t try to make excuses. He actually attempted to do something when these concerns were first raised.
“Flock cameras have already proven to be an invaluable investigative tool in solving crimes and keeping our community safe,” Lynnwood Police Chief Cole Langdon said. “However, it’s equally important that we maintain the public’s trust.”
The ALPR program in Lynnwood launched June 29, 2025, with 25 cameras funded through a Washington Auto Theft Prevention Authority grant.
Shortly after implementation, the department learned a vendor-enabled “nationwide search” feature allowed broader access than Lynnwood authorized.
Police said they worked with Flock Safety to disable that feature on July 8.
While Flock honored its customer’s request in that case, it has gone the other way just as often. The company has previously been caught illegally installing cameras. In September, it was caught reinstalling cameras the city of Evanston, Illinois had ordered removed because the network (and Flock’s access options) violated the state’s privacy laws.
Private surveillance vendor Flock Safety reinstalled all of its stationary license plate cameras in Evanston that had previously been removed, apparently doing so without authorization from the city, which sent the company a cease-and-desist order Tuesday afternoon demanding that the cams be taken back down.
The city previously ordered Flock to shut down 19 automated license plate readers (18 stationary and one flex camera that can be attached to a squad car) provided by the company and put its contract with Flock on a 30-day termination notice on Aug. 26.
Predictably, this push-back against Flock is generally occurring in areas already being threatened/invaded on a daily basis by the US military and swarms of federal officers. But that’s to be expected. Those most threatened by federal abuse of local camera networks are always going to be the first to fight back. The reason it’s not happening in “red” states is because the people running those states honestly don’t care what route enables authoritarianism, just so long as it does so while their party still holds power.
More than 80 law enforcement agencies across the United States have used language perpetuating harmful stereotypes against Romani people when searching the nationwide Flock Safety automated license plate reader (ALPR) network, according to audit logs obtained and analyzed by the Electronic Frontier Foundation.
When police run a search through the Flock Safety network, which links thousands of ALPR systems, they are prompted to leave a reason and/or case number for the search. Between June 2024 and October 2025, cops performed hundreds of searches for license plates using terms such as “roma” and “g*psy,” and in many instances, without any mention of a suspected crime. Other uses include “g*psy vehicle,” “g*psy group,” “possible g*psy,” “roma traveler” and “g*psy ruse,” perpetuating systemic harm by demeaning individuals based on their race or ethnicity.
These queries were run through thousands of police departments’ systems—and it appears that none of these agencies flagged the searches as inappropriate.
These searches are, by definition, racist.
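None of this would have been hard to catch. Below is a minimal sketch of the kind of check any receiving agency could have run over its own audit-log exports; the CSV layout and column names are assumptions, not Flock’s documented schema.

```python
# Minimal sketch: scan a hypothetical CSV export of Flock audit logs for
# the search reasons described above. The "timestamp", "agency", and
# "reason" columns are assumptions, not Flock's actual export format.
import csv
import re

# The slur is matched via a character class so it never appears spelled
# out here. Naive matching over-includes (e.g., "Romanian"), which is why
# any hits would still need human review.
FLAGGED = re.compile(r"g[iy]psy|\broma\b|irish traveller", re.IGNORECASE)

def flag_searches(path):
    """Yield audit-log rows whose stated search reason matches a flagged pattern."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if FLAGGED.search(row.get("reason", "")):
                yield row

for hit in flag_searches("flock_audit_log.csv"):
    print(hit["timestamp"], hit["agency"], hit["reason"])
```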
Word Choices and Flock Searches
We are using the terms “Roma” and “Romani people” as umbrella terms, recognizing that they represent different but related groups. Since 2020, the U.S. federal government has officially recognized “Anti-Roma Racism” as including behaviors such as “stereotyping Roma as persons who engage in criminal behavior” and using the slur “g*psy.” According to the U.S. Department of State, this language “leads to the treatment of Roma as an alleged alien group and associates them with a series of pejorative stereotypes and distorted images that represent a specific form of racism.”
Nevertheless, police officers have run hundreds of searches for license plates using the terms “roma” and “g*psy.” (Unlike the police ALPR queries we’ve uncovered, we substitute an asterisk for the Y to avoid repeating this racist slur). In many cases, these terms have been used on their own, with no mention of crime. In other cases, the terms have been used in contexts like “g*psy scam” and “roma burglary,” when ethnicity should have no relevance to how a crime is investigated or prosecuted.
A “g*psy scam” and “roma burglary” do not exist in criminal law separate from any other type of fraud or burglary. Several agencies contacted by EFF have since acknowledged the inappropriate use and described efforts to address the issue internally.
“The use of the term does not reflect the values or expected practices of our department,” a representative of the Palos Heights (IL) Police Department wrote to EFF after being confronted with two dozen searches involving the term “g*psy.” “We do not condone the use of outdated or offensive terminology, and we will take this inquiry as an opportunity to educate those who are unaware of the negative connotation and to ensure that investigative notations and search reasons are documented in a manner that is accurate, professional, and free of potentially harmful language.”
Of course, the broader issue is that allowing “g*psy” or “Roma” as a reason for a search isn’t just offensive; it implies the criminalization of an ethnic group. In fact, the Grand Prairie Police Department in Texas searched for “g*psy” six times while using Flock’s “Convoy” feature, which allows an agency to identify vehicles traveling together—in essence targeting an entire traveling community of Roma without specifying a crime.
At the bottom of this post is a list of agencies and the terms they used when searching the Flock system.
Anti-Roma Racism in an Age of Surveillance
Racism against Romani people has been a problem for centuries, with one of its most horrific manifestations during the Holocaust, when the Third Reich and its allies perpetrated genocide by murdering hundreds of thousands of Romani people and sterilizing thousands more. Despite efforts by the UN and EU to combat anti-Roma discrimination, this form of racism persists. As scholars Margareta Matache and Mary T. Bassett explain, it is perpetuated by modern American policing practices:
In recent years, police departments have set up task forces specialised in “G*psy crimes”, appointed “G*psy crime” detectives, and organised police training courses on “G*psy criminality”. The National Association of Bunco Investigators (NABI), an organisation of law enforcement professionals focusing on “non-traditional organised crime”, has even created a database of individuals arrested or suspected of criminal activity, which clearly marked those who were Roma.
Thus, it is no surprise that a 2020 Harvard University survey of Romani Americans found that 4 out of 10 respondents reported being subjected to racial profiling by police. This demonstrates the ongoing challenges they face due to systemic racism and biased policing.
Notably, many police agencies using surveillance technologies like ALPRs have adopted some sort of basic policy against biased policing or the use of these systems to target people based on race or ethnicity. But even when such policies are in place, an agency’s failure to enforce them allows these discriminatory practices to persist. These searches were also run through the systems of thousands of other police departments that may have their own policies and state laws that prohibit bias-based policing—yet none of those agencies appeared to have flagged the searches as inappropriate.
The Flock search data in question here shows that surveillance technology exacerbates racism, and even well-meaning policies to address bias can quickly fall apart without proper oversight and accountability.
Cops In Their Own Words
EFF reached out to a sample of the police departments that ran these searches. Here are five representative responses we received from police departments in Illinois, California, and Virginia. They do not inspire confidence.
1. Lake County Sheriff’s Office, IL
In June 2025, the Lake County Sheriff’s Office ran three searches for a dark colored pick-up truck, using the reason: “G*PSY Scam.” The search covered 1,233 networks, representing 14,467 different ALPR devices.
In response to EFF, a sheriff’s representative wrote via email:
“Thank you for reaching out and for bringing this to our attention. We certainly understand your concern regarding the use of that terminology, which we do not condone or support, and we want to assure you that we are looking into the matter.
Any sort of discriminatory practice is strictly prohibited at our organization. If you have the time to take a look at our commitment to the community and our strong relationship with the community, I firmly believe you will see discrimination is not tolerated and is quite frankly repudiated by those serving in our organization.
We appreciate you bringing this to our attention so we can look further into this and address it.”
2. Sacramento Police Department, CA
In May 2025, the Sacramento Police Department ran six searches using the term “g*psy.” The search covered 468 networks, representing 12,885 different ALPR devices.
In response to EFF, a police representative wrote:
“Thank you again for reaching out. We looked into the searches you mentioned and were able to confirm the entries. We’ve since reminded the team to be mindful about how they document investigative reasons. The entry reflected an investigative lead, not a disparaging reference.
We appreciate the chance to clarify.”
3. Palos Heights Police Department, IL
In September 2024, the Palos Heights Police Department ran more than two dozen searches using terms such as “g*psy vehicle,” “g*psy scam” and “g*psy concrete vehicle.” Most searches hit roughly 1,000 networks.
In response to EFF, a police representative said the searches were related to a single criminal investigation into a vehicle involved in a “suspicious circumstance/fraudulent contracting incident” and were “not indicative of a general search based on racial or ethnic profiling.” However, the agency acknowledged the language was inappropriate:
“The use of the term does not reflect the values or expected practices of our department. We do not condone the use of outdated or offensive terminology, and we will take this inquiry as an opportunity to educate those who are unaware of the negative connotation and to ensure that investigative notations and search reasons are documented in a manner that is accurate, professional, and free of potentially harmful language.
We appreciate your outreach on this matter and the opportunity to provide clarification.”
4. Irvine Police Department, CA
In February and May 2025, the Irvine Police Department ran eight searches using the term “roma” in the reason field. The searches covered 1,420 networks, representing 29,364 different ALPR devices.
In a call with EFF, an IPD representative explained that the cases were related to a series of organized thefts. However, they acknowledged the issue, saying, “I think it’s an opportunity for our agency to look at those entries and to use a case number or use a different term.”
5. Fairfax County Police Department, VA
Between December 2024 and April 2025, the Fairfax County Police Department ran more than 150 searches involving terms such as “g*psy case” and “roma crew burglaries.” Fairfax County PD continued to defend its use of this language.
In response to EFF, a police representative wrote:
“Thank you for your inquiry. When conducting searches in investigative databases, our detectives must use the exact case identifiers, terms, or names connected to a criminal investigation in order to properly retrieve information. These entries reflect terminology already tied to specific cases and investigative files from other agencies, not a bias or judgment about any group of people. The use of such identifiers does not reflect bias or discrimination and is not inconsistent with our Bias-Based Policing policy within our Human Relations General Order.”
A National Trend
Roma individuals and families are not the only ones being systematically and discriminatorily targeted by ALPR surveillance technologies. For example, Flock audit logs show agencies ran 400 more searches using terms targeting Traveller communities more generally, with a specific focus on Irish Travellers, often without any mention of a crime.
Across the country, these tools are enabling and amplifying racial profiling by embedding longstanding policing biases into surveillance technologies. For example, data from Oak Park, IL, show that 84% of drivers stopped in Flock-related traffic incidents were Black—despite Black people making up only 19% of the local population. ALPR systems are far from being neutral tools for public safety and are increasingly being used to fuel discriminatory policing practices against historically marginalized people.
The racially coded language in Flock’s logs mirrors long-standing patterns of discriminatory policing. Terms like “furtive movements,” “suspicious behavior,” and “high crime area” have always been cited by police to try to justify stops and searches of Black, Latine, and Native communities. These phrases might not appear in official logs because they’re embedded earlier in enforcement—in the traffic stop without clear cause, the undocumented stop-and-frisk, the intelligence bulletin flagging entire neighborhoods as suspect. They function invisibly until a body-worn camera, court filing, or audit brings them to light. Flock’s network didn’t create racial profiling; it industrialized it, turning deeply encoded and vague language into scalable surveillance that can search thousands of cameras across state lines.
The Path Forward
U.S. Sen. Ron Wyden, D-OR, recently recommended that local governments reevaluate their decisions to install Flock Safety in their communities. We agree, but we also understand that sometimes elected officials need to see the abuse with their own eyes first.
We know which agencies ran these racist searches, and they should be held accountable. But we also know that the vast majority of Flock Safety’s clients—thousands of police and sheriffs—also allowed those racist searches to run through their Flock Safety systems unchallenged.
Elected officials must act decisively to address the racist policing enabled by Flock’s infrastructure. First, they should demand a complete audit of all ALPR searches conducted in their jurisdiction and a review of search logs to determine (a) whether their police agencies participated in discriminatory policing and (b) what safeguards, if any, exist to prevent such abuse. Second, officials should institute immediate restrictions on data-sharing through Flock’s nationwide network. As demonstrated by California law, for example, police agencies should not be able to share their ALPR data with federal authorities or out-of-state agencies, thus eliminating a vehicle by which discriminatory searches spread across state lines.
Ultimately, elected officials must terminate Flock Safety contracts entirely. The evidence is now clear: audit logs and internal policies alone cannot prevent a surveillance system from becoming a tool for racist policing. The fundamental architecture of Flock—thousands of cameras feeding into a nationwide searchable network—makes discrimination inevitable when enforcement mechanisms fail.
As Sen. Wyden astutely explained, “local elected officials can best protect their constituents from the inevitable abuses of Flock cameras by removing Flock from their communities.”
Table Overview and Notes
The following table compiles terms used by agencies to describe the reasons for searching the Flock Safety ALPR database. In a small number of cases, we removed additional information such as case numbers, specific incident details, and officers’ names that were present in the reason field.
We removed one agency from the list because it indicated the word was a person’s name and not a reference to Romani people.
In general, we did not include searches that used the term “Romanian,” although many of those may also be indicative of anti-Roma bias. We also did not include uses of “traveler” or “Traveller” when it did not include a clear ethnic modifier; however, we believe many of those searches are likely relevant.
A text-based version of the spreadsheet is available here.
I’m not saying this unholy matrimony wouldn’t have occurred under any other regime, but it’s definitely the sort of thing that plays well with the Oval Office while it’s housing Donald Trump.
Both Flock Safety and Ring have weathered plenty of negative press, largely because they were doing the sort of thing they’re going back to doing now: turning private cameras into extensions of government surveillance networks.
Flock Safety began by pitching its products to some of the most secure people in the nation: wealthy white homeowners. It became just another way for gated communities and HOAs to keep tabs on residents while casting a skeptical eye toward anyone (or any vehicle) those running the cameras didn’t immediately recognize.
Then it invited cops to play with its equipment and install some of their own. It went from keeping black people out of white neighborhoods to becoming a tool to be wielded by cops as they searched for a woman who had terminated a pregnancy — not because cops cared about her well-being, but at the behest of her apparently abusive boyfriend. Law enforcement investigators and officials claimed the nationwide searches for the person seeking an abortion were all about finding her safely. Even after internal documents revealed it was actually about finding her in hopes of pressing charges for violating Texas’s abortion ban, Flock Safety has continued to criticize journalists for reporting on this apparent abuse of its camera network.
Ring democratized front door surveillance, for better or worse. It gave people a cheap option for keeping crime off their literal doorstep. But it also invited cops along for the ride, giving them free cameras to hand out to citizens with the implied suggestion a free camera would result in warrantless access to footage any time the cops felt like looking at it.
Ring finally rolled back its carte blanche cop access and demanded a bit more paperwork from law enforcement before allowing agencies to raid its cloud storage. Flock Safety — in response to congressional criticism — made vague statements about limiting abuse of camera access by law enforcement. Of course, those words were meaningless, as Senator Ron Wyden recently pointed out in a letter to Flock Safety CEO Garrett Langley:
In August, 9 News in Denver revealed that Flock granted U.S. Customs and Border Protection (CBP) access to its systems, enabling the agency to search data collected by Flock’s cameras, including using the National Lookup Tool. Officials from Flock subsequently confirmed to my office in September that the company provided access to CBP, Homeland Security Investigations (HSI), the Secret Service, and the Naval Criminal Investigative Service as part of a pilot earlier this year. Flock told my office that during the pilot, which has now ended, CBP and HSI conducted approximately 200 and 175 searches respectively. Flock also confirmed that it misled its state and local law enforcement customers, telling my office that “due to internal miscommunication, customers were inaccurately informed that Flock did not have any relationship with DHS, while pilot programs with sub-agencies of DHS were briefly active.”
The abortion investigation described above is also mentioned in the letter, which closes with Ron Wyden telling the company that no one should trust what Flock Safety says, because when it’s not misleading people, it’s both unable and unwilling to place meaningful restrictions on law enforcement access to its nationwide network of cameras:
The privacy protection that Flock promised to Oregonians — that Flock software will automatically examine the reason provided by law enforcement officers for terms indicating an abortion- or immigration-related search — is meaningless when law enforcement officials provide generic reasons like “investigation” or “crime.” Likewise, Flock’s filters are meaningless if no reason for a search is provided in the first place. While the search reasons collected by Flock, obtained by press and activists through open records requests, have occasionally revealed searches for immigration and abortion enforcement, these are likely just the tip of the iceberg. Presumably, most officers using Flock to hunt down immigrants and women who have received abortions are not going to type that in as the reason for their search. And, regardless, given that Flock has washed its hands of any obligation to audit its customers, Flock customers have no reason to trust a search reason provided by another agency.
I now believe that abuses of your product are not only likely but inevitable, and that Flock is unable and uninterested in preventing them.
Law enforcement agencies will soon have easier access to footage captured by Amazon’s Ring smart cameras. In a partnership announced this week, Amazon will allow approximately 5,000 local law enforcement agencies to request access to Ring camera footage via surveillance platforms from Flock Safety.
[…]
According to Flock’s announcement, its Ring partnership allows local law enforcement members to use Flock software “to send a direct post in the Ring Neighbors app with details about the investigation and request voluntary assistance.” Requests must include “specific location and timeframe of the incident, a unique investigation code, and details about what is being investigated,” and users can look at the requests anonymously, Flock said.
[…]
Flock said its local law enforcement users will gain access to Ring Community Requests in “the coming months.”
We absolutely didn’t need these two major players in the private surveillance market to team up and offer expanded access to US law enforcement — especially when so much of US law enforcement is focused on the “criminal” acts listed in Wyden’s letter: abortions and immigration.
According to Ars Technica’s reporting, Ring is the most active participant in this new surveillance dragnet. First, Ring rolled back its promise to limit law enforcement access to Ring footage by partnering with Axon, a heavy-hitter in the US body camera marketplace. Then it decided to court one of the rivals in its own marketplace, which means both companies can still pretend to hold unique ideals while ensuring the bastard child of this coupling will render those ideals irrelevant.
Flock says that its cameras don’t use facial recognition, which has been criticized for racial biases. But local law enforcement agencies using Flock will soon have access to footage from Ring cameras with facial recognition.
Both companies will be able to blame each other the next time abusive access is revealed. And Ring’s network will presumably gain features it doesn’t currently have via its meshing with Flock, like license plate recognition and algorithmic search that lets cops comb footage using nothing but vehicle or clothing descriptions, things they can’t do with Ring alone.
And this assurance is especially meaningless, given what’s already known about both of these companies:
Amazon and Flock say their collaboration will only involve voluntary customers and local enforcement agencies.
When both companies store recordings in their own clouds, “voluntary” is beside the point. Law enforcement can just approach either company directly with warrants or subpoenas and get what has been denied to them by these companies’ customers. And restraining searches to “local law enforcement” agencies is impossible if neither company is interested in limiting searches to local areas and/or taking steps to prevent local agencies from performing searches on behalf of federal officers.
Even if both companies take heat for doing this, they’ll still do it. After all, they’ve got an entire administration standing behind them that’s willing to call anyone who questions or criticizes this unofficial merger a friend of criminals, if not an actual enemy of the nation.
New documents and court records obtained by EFF show that Texas deputies queried Flock Safety’s surveillance data in an abortion investigation, contradicting the narrative promoted by the company and the Johnson County Sheriff that the woman was “being searched for as a missing person,” and that “it was about her safety.”
The new information shows that deputies had initiated a “death investigation” of a “non-viable fetus,” logged evidence of a woman’s self-managed abortion, and consulted prosecutors about possibly charging her.
Johnson County Sheriff Adam King repeatedly denied the automated license plate reader (ALPR) search was related to enforcing Texas’s abortion ban, and Flock Safety called media accounts “false,” “misleading” and “clickbait.” However, according to a sworn affidavit by the lead detective, the case was in fact a death investigation in response to a report of an abortion, and deputies collected documentation of the abortion from the “reporting person,” her alleged romantic partner. The death investigation remained open for weeks, with detectives interviewing the woman and reviewing her text messages about the abortion.
The documents show that the Johnson County District Attorney’s Office informed deputies that “the State could not statutorily charge [her] for taking the pill to cause the abortion or miscarriage of the non-viable fetus.”
An excerpt from the JCSO detective’s sworn affidavit.
The records include previously unreported details about the case that shocked public officials and reproductive justice advocates across the country when it was first reported by 404 Media in May. The case serves as a clear warning sign that when data from ALPRs is shared across state lines, it can put people at risk, including abortion seekers. And, in this case, the use may have run afoul of laws in Washington and Illinois.
A False Narrative Emerges
Last May, 404 Media obtained data revealing the Johnson County Sheriff’s Office conducted a nationwide search of more than 83,000 Flock ALPR cameras, giving the reason in the search log: “had an abortion, search for female.” Both the Sheriff’s Office and Flock Safety have attempted to downplay the search as akin to a search for a missing person, claiming deputies were only looking for the woman to “check on her welfare” and that officers found a large amount of blood at the scene – a claim now contradicted by the responding investigator’s affidavit. Flock Safety went so far as to assert that journalists and advocates covering the story intentionally misrepresented the facts, describing it as “misreporting” and “clickbait-driven.”
As Flock wrote of EFF’s previous commentary on this case (bold in original statement):
Earlier this month, there was purposefully misleading reporting that a Texas police officer with the Johnson County Sheriff’s Office used LPR “to target people seeking reproductive healthcare.” This organization is actively perpetuating narratives that have been proven false, even after the record has been corrected.
According to the Sheriff in Johnson County himself, this claim is unequivocally false.
… No charges were ever filed against the woman and she was never under criminal investigation by Johnson County. She was being searched for as a missing person, not as a suspect of a crime.
That sheriff has since been arrested and indicted on felony counts in an unrelated sexual harassment and whistleblower retaliation case. He has also been charged with aggravated perjury for allegedly lying to a grand jury. EFF filed public records requests with Johnson County to obtain a more definitive account of events.
The newly released incident report and affidavit unequivocally describe the case as a “death investigation” of a “non-viable fetus.” These documents also undermine the claim that the ALPR search was in response to a medical emergency, since, in fact, the abortion had occurred more than two weeks before deputies were called to investigate.
In recent years, anti-abortion advocates and prosecutors have increasingly attempted to use “fetal homicide” and “wrongful death” statutes – originally intended to protect pregnant people from violence – to criminalize abortion and pregnancy loss. These laws, which exist in dozens of states, establish legal personhood of fetuses and can be weaponized against people who end their own pregnancies or experience a miscarriage.
In fact, a new report from Pregnancy Justice found that in just the first two years since the Supreme Court’s decision in Dobbs, prosecutors initiated at least 412 cases charging pregnant people with crimes related to pregnancy, pregnancy loss, or birth–most under child neglect, endangerment, or abuse laws that were never intended to target pregnant people. Nine cases included allegations around individuals’ abortions, such as possession of abortion medication or attempts to obtain an abortion–instances just like this one. The report also highlights how, in many instances, prosecutors use tangentially related criminal charges to punish people for abortion, even when abortion itself is not illegal.
By framing their investigation of a self-administered abortion as a “death investigation” of a “non-viable fetus,” Texas law enforcement was signaling their intent to treat the woman’s self-managed abortion as a potential homicide, even though Texas law does not allow criminal charges to be brought against an individual for self-managing their own abortion.
The Investigator’s Sworn Account
Over two days in April, the woman went through the process of taking medication to induce an abortion. Two weeks later, her partner–who would later be charged with domestic violence against her–reported her to the sheriff’s office.
The documents confirm that the woman was not present at the home when the deputies “responded to the death (Non-viable fetus).” As part of the investigation, officers collected the evidence of the self-managed abortion that the man had assembled, including photographs, the FedEx envelope the medication arrived in, and the instructions for self-administering the medication.
Another Johnson County official ran two searches through the ALPR database with the note “had an abortion, search for female,” according to Flock Safety search logs obtained by EFF. The first search, which has not been previously reported, probed 1,295 Flock Safety networks–composed of 17,684 different cameras–going back one week. The second search, which was originally exposed by 404 Media, was expanded to a full month of data across 6,809 networks, including 83,345 cameras. Both searches listed the same case number that appears on the death investigation/incident report obtained by EFF.
After collecting the evidence from the woman’s partner, the investigators say they consulted the district attorney’s office, only to be told they could not press charges against the woman.
An excerpt from the JCSO detective’s sworn affidavit.
Nevertheless, when the subject showed up at the Sheriff’s office a week later, officers were under the impression that she came “to tell her side of the story about the non-viable fetus.” They interviewed her, inspected text messages about the abortion on her phone, and watched her write a timeline of events.
Only after all that did they learn that she actually wanted to report a violent assault by her partner–the same individual who had called the police to report her abortion. She alleged that less than an hour after the abortion, he choked her, put a gun to her head, and made her beg for her life. The man was ultimately charged in connection with the assault, and the case is ongoing.
This documented account runs completely counter to what law enforcement and Flock have said publicly about the case.
Johnson County Sheriff Adam King told 404 Media: “Her family was worried that she was going to bleed to death, and we were trying to find her to get her to a hospital.” He later told the Dallas Morning News: “We were just trying to check on her welfare and get her to the doctor if needed, or to the hospital.”
The account by the detective on the scene makes no mention of concerned family members or a medical investigator. To the contrary, the affidavit says that they questioned the man as to why he “waited so long to report the incident,” and he responded that he needed to “process the event and call his family attorney.” The ALPR search was recorded 2.5 hours after the initial call came in, as documented in the investigation report.
The Desk Sergeant’s Report—One Month Later
EFF obtained a separate “case supplemental report” written by the sergeant who says he ran the May 9 ALPR searches.
The sergeant was not present at the scene, and his account was written belatedly on June 5, almost a month after the incident and nearly a week after 404 Media had already published the sheriff’s alternative account of the Flock Safety search, kicking off a national controversy. The sheriff’s office provided this sergeant’s report to Dallas Morning News.
In the report, the sergeant claims that the officers on the ground asked him to start “looking up” the woman due to there being “a large amount of blood” found at the residence—an unsubstantiated claim that is in conflict with the lead investigator’s affidavit. The sergeant repeatedly expresses that the situation was “not making sense.” He claims he was worried that the partner had hurt the woman and her children, so “to check their welfare,” he used TransUnion’s TLO commercial investigative database system to look up her address. Once he identified her vehicle, he ran the plate through the Flock database, returning hits in Dallas.
Two abortion-related searches in the JCSO’s Flock Safety ALPR audit log
The sergeant’s report, filed after the case attracted media attention, notably omits any mention of the abortion at the center of the investigation, although it does note that the caller claimed to have found a fetus. The report does not explain, or even address, why the sergeant used the phrase “had an abortion, search for female” as the official reason for the ALPR searches in the audit log.
It’s also unclear why the sergeant submitted the supplemental report at all, weeks after the incident. By that time, the lead investigator had already filed a sworn affidavit that contradicted the sergeant’s account. For example, the investigator, who was on the scene, does not describe finding any blood or taking blood samples into evidence, only photographs of what the partner believed to be the fetus.
One area where they concur: both reports are clearly marked as a “death investigation.”
Correcting the Record
Since 404 Media first reported on this case, King has perpetuated the false narrative, telling reporters that the woman was never under investigation, that officers had not considered charges against her, and that “it was all about her safety.”
But here are the facts:
The reports that have been released so far describe this as a death investigation.
The lead detective described himself as “working a death investigation… of a non-viable fetus” at the time he interviewed the woman (a week after the ALPR searches).
The detective wrote that they consulted the district attorney’s office about whether they could charge her for “taking the pill to cause the abortion or miscarriage of the non-viable fetus.” They were told they could not.
Investigators collected a lot of data, including photos and documentation of the abortion, and ran her through multiple databases. They even reviewed her text messages about the abortion.
The death investigation was open for more than a month.
The death investigation was only marked closed in mid-June, weeks after 404 Media’s article and mere days before the Dallas Morning News published its report, in which the sheriff inaccurately claimed the woman “was not under investigation at any point.”
Flock has promoted this unsupported narrative on its blog and in multimedia appearances. We did not reach out to Flock for comment on this article, as their communications director previously told us the company will not answer our inquiries until we “correct the record and admit to your audience that you purposefully spread misinformation which you know to be untrue” about this case.
Consider the record corrected: It turns out the truth is even more damning than initially reported.
The Aftermath
In the aftermath of the original reporting, government officials began to take action. The networks searched by Johnson County included cameras in Illinois and Washington state, both states where abortion access is protected by law. Since then:
The Illinois Secretary of State has announced his intent to “crack down on unlawful use of license plate reader data,” and urged the state’s Attorney General to investigate the matter.
In California, which also has prohibitions on sharing ALPR out of state and for abortion-ban enforcement, the legislature cited the case in support of pending legislation to restrict ALPR use.
Ranking Members of the House Oversight Committee and one of its subcommittees launched a formal investigation into Flock’s role in “enabling invasive surveillance practices that threaten the privacy, safety, and civil liberties of women, immigrants, and other vulnerable Americans.”
Senator Ron Wyden secured a commitment from Flock to protect Oregonians’ data from out-of-state immigration and abortion-related queries.
In response to mounting pressure, Flock announced a series of new features supposedly designed to prevent future abuses. These include blocking “impermissible” searches, requiring that all searches include a “reason,” and implementing AI-driven audit alerts to flag suspicious activity. But as we’ve detailed elsewhere, these measures are cosmetic at best—easily circumvented by officers using vague search terms or reusing legitimate case numbers. The fundamental architecture that enabled the abuse remains unchanged.
Meanwhile, as the news continued to harm the company’s sales, Flock CEO Garrett Langley embarked on a press tour to smear reporters and others who had raised alarms about the usage. In an interview with Forbes, he even doubled down and extolled the use of the ALPR in this case.
So when I look at this, I go “this is everything’s working as it should be.” A family was concerned for a family member. They used Flock to help find her, when she could have been unwell. She was physically okay, which is great. But due to the political climate, this was really good clickbait.
Nothing about this is working as it should, but it is working as Flock designed.
The Danger of Unchecked Surveillance
This case reveals the fundamental danger of allowing companies like Flock Safety to build massive, interconnected surveillance networks that can be searched across state lines with minimal oversight. When a single search query can access more than 83,000 cameras spanning almost the entire country, the potential for abuse is staggering, particularly when weaponized against people seeking reproductive healthcare.
The searches in this case may have violated laws in states like Washington and Illinois, where restrictions exist specifically to prevent this kind of surveillance overreach. But those protections mean nothing when a Texas deputy can access cameras in those states with a few keystrokes, without any external review of whether the search is legal and legitimate under local law. In this case, external agencies should have seen the word “abortion” and questioned the search, but the next time an officer is investigating such a case, they may use a more vague or misleading term to justify the search. In fact, it’s possible that has already happened.
ALPRs were marketed to the public as tools to find stolen cars and locate missing persons. Instead, they’ve become a dragnet that allows law enforcement to track anyone, anywhere, for any reason—including investigating people’s healthcare decisions. This case makes clear that neither the companies profiting from this technology nor the agencies deploying it can be trusted to tell the full story about how it’s being used.
States must ban law enforcement from using ALPRs to investigate healthcare decisions and prohibit sharing data across state lines. Local governments may try remedies like reducing data retention periods to minutes instead of weeks or months—but, really, ending their ALPR programs altogether is the strongest way to protect their most vulnerable constituents. Without these safeguards, every license plate scan becomes a potential weapon against a person seeking healthcare.
Flock Safety, the police technology company most notable for its extensive network of automated license plate readers spread throughout the United States, is rolling out a new and troubling product that may create headaches for the cities that adopt it: detection of “human distress” via audio. As part of its suite of technologies, Flock has been pushing Raven, its version of acoustic gunshot detection. These devices capture sounds in public places and use machine learning to try to identify gunshots and then alert police—but EFF has long warned that they are also high-powered microphones parked above densely populated city streets. Cities now have one more reason to follow the lead of many other municipalities and cancel their Flock contracts before this new feature causes civil liberties harms to residents.
In marketing materials, Flock has been touting new features to their Raven product—including the ability of the device to alert police based on sounds, including “distress.” The online ad for the product, which allows cities to apply for early access to the technology, shows the image of police getting an alert for “screaming.”
It’s unclear how this technology works. For acoustic gunshot detection, generally the microphones are looking for sounds that would signify gunshots (though in practice they often mistake car backfires or fireworks for gunshots). Flock needs to come forward now with an explanation of exactly how their new technology functions. It is unclear how these devices will interact with state “eavesdropping” laws that limit listening to or recording the private conversations that often take place in public.
Flock is no stranger to causing legal challenges for the cities and states that adopt its products. In Illinois, Flock was accused of violating state law by allowing Immigration and Customs Enforcement (ICE), a federal agency, access to license plate reader data taken within the state. That’s not all. In 2023, a North Carolina judge halted the installation of Flock cameras statewide for operating in the state without a license. When the city of Evanston, Illinois recently canceled its contract with Flock, it ordered the company to take down their license plate readers–only for Flock to mysteriously reinstall them a few days later. The city has since sent Flock a cease-and-desist order and, in the meantime, has put black tape over the cameras. For some, the technology isn’t worth its mounting downsides. As one Illinois village trustee wrote while explaining his vote to cancel the city’s contract with Flock, “According to our own Civilian Police Oversight Commission, over 99% of Flock alerts do not result in any police action.”
Gunshot detection technology is dangerous enough as it is—police showing up to alerts they think are gunfire only to find children playing with fireworks is a recipe for innocent people to get hurt. This isn’t hypothetical: in Chicago a child really was shot at by police who thought they were responding to a shooting thanks to a ShotSpotter alert. Introducing a new feature that allows these pre-installed Raven microphones all over cities to begin listening for human voices in distress is likely to open up a whole new can of unforeseen legal, civil liberties, and even bodily safety consequences.
Langley offers a prediction: In less than 10 years, Flock’s cameras, airborne and fixed, will eradicate almost all crime in the U.S.
That would be Flock Safety CEO (and co-founder) Garrett Langley speaking to Thomas Brewster of Forbes. Flock Safety has grown a lot over the past few years, following paths paved by Ring, Amazon’s doorbell surveillance camera acquisition, and other upstarts in the public/private surveillance mesh network field.
Like Ring, Flock has sold a bunch of products to regular people, starting with the people most likely to have discretionary income and the desire to wield it against other human beings: homeowners associations and residents of gated communities.
Like Ring, Flock has allowed racists to convert their bigotry into action. And it has also allowed (and encouraged) law enforcement agencies to treat privately-owned cameras as extensions of their own surveillance networks.
Not content to add license plate reader tech to cameras owned by non-cops, Flock now wants to fill the air with another mesh network of public/private ownership via its latest offering:
Since its founding in 2017, Flock, which was valued at $7.5 billion in its most recent funding round, has quietly built a network of more than 80,000 cameras pointed at highways, thoroughfares and parking lots across the U.S. They record not just the license plate numbers of the cars that pass them, but their make and distinctive features—broken windows, dings, bumper stickers. Langley estimates its cameras help solve 1 million crimes a year. Soon they’ll help solve even more. In August, Flock’s cameras will take to the skies mounted on its own “made in America” drones.
Sure, Flock and its CEO may be worth billions. But that doesn’t actually make Langley smart. It just makes him opportunistic enough to take advantage of perpetual false narratives (some perpetuated by Flock itself!) about crime rates. Some people say “Orwellian.” Others, like Garrett Langley, just say “year-over-year growth.”
“I’ve talked to plenty of activists who think crime is just the cost of modern society. I disagree,” Langley says. “I think we can have a crime-free city and civil liberties. . . . We can have it all.” In municipalities in which Flock is deployed, he adds, the average criminal—those between 16 and 24 committing nonviolent crime—“will most likely get caught.”
And there it is: a person in the surveillance tech business refusing to discuss civil liberty concerns honestly, choosing instead to wave them away with a statement that indicates anything that stands in the way of Flock’s continued profitability (or the pipe dream of removing any and all crime and/or 16-24-year-old citizens from US streets) isn’t worth his attention.
But maybe he should be paying more attention to the law, especially the stuff about civil liberties. His company has already been accused of ignoring local laws while selling and/or installing cameras. And Flock got dragged back into whatever the opposite of the limelight is earlier this year, when it was discovered Texas cops were able to access Flock license plate reader data all over the nation as they tried to locate a Texas resident who had apparently left the state to obtain an abortion — something that’s illegal in Texas.
Flock didn’t have much to say about this turn of events at the time. And the officers who performed the search claimed their only interest was in locating this person to ensure she was safe, even as they work for a state whose anti-abortion laws make extremely clear that the state government doesn’t actually care about the safety of women.
Meanwhile, in Illinois, Flock is rapidly backpedaling on its information sharing agreements after multiple lawmakers alleged the company broke state data privacy laws by allowing pretty much any government agency from anywhere in the nation to access Flock ALPR data.
Flock Safety, whose cameras are mounted in more than 4,000 communities nationwide, put a hold last week on pilot programs with the Department of Homeland Security’s Customs and Border Protection and its law enforcement arm, Homeland Security Investigations, according to a statement by its founder and CEO, Garrett Langley.
Among officials in other jurisdictions, Illinois Secretary of State Alexi Giannoulias raised concerns. He announced Monday that an audit found Customs and Border Protection had accessed Illinois data, although he didn’t say that the agency was seeking immigration-related information. A 2023 law the Democrat pushed bars sharing license plate data with police investigating out-of-state abortions or undocumented immigrants.
“This sharing of license plate data of motorists who drive on Illinois roads is a clear violation of the state law,” Giannoulias said in a statement. “This law, passed two years ago, aimed to strengthen how data is shared and prevent this exact thing from happening.”
That has led to contracts being cancelled in the state of Illinois, which certainly isn’t going to contribute to Langley’s fantasies of “ending crime” via massively profitable mass surveillance systems sold by his company.
Oak Park voted to terminate its contract with Flock earlier this month.
Tuesday, following the state’s audit, the city of Evanston did the same, saying in a statement, in part:
“The findings of the Illinois Secretary of State’s audit, combined with Flock’s admission that it failed to establish distinct permissions and protocols to ensure local compliance while running a pilot program with federal users, are deeply troubling. As a result, the City has deactivated the cameras and issued a termination notice to Flock, effective September 26, 2025.”
And it’s not just limited to a state with some of the most robust privacy laws in the nation. It’s also happening in Texas, the same state that kicked this backlash off when officers decided it was okay to use Flock’s system to try to locate someone who might have been considering violating the state’s anti-abortion laws.
Austin organizers turned out to rebuke the city’s misguided contract with Flock Safety—and won. This successful pushback from the community means at the end of the month Austin police will no longer be able to use the surveillance network of automated license plate readers (ALPRs) across the city.
It’s pretty rich to claim you can stop all crime when you can’t even stop breaking the law or enabling users of your tech to break the law. What’s listed above are the words of a salesman, not a person truly concerned about crime rates or making communities safer. The pitch thrives on the American constant of believing crime rates are worse than they actually are. And it depends on the wealth of people and governments who also feel civil liberties are privileges that should only be enjoyed by the richest (and, obviously, whitest) people in the nation. Everyone else should just learn to accept their loss of rights and privacy gratefully, for the good of the nation.
Two recent statements from the surveillance company—one addressing Illinois privacy violations and another defending the company’s national surveillance network—reveal a troubling pattern: when confronted by evidence of widespread abuse, Flock Safety has blamed users, downplayed harms, and doubled down on the very systems that enabled the violations in the first place.
Flock’s aggressive public relations campaign to salvage its reputation comes as no surprise. Last month, we described how investigative reporting from 404 Media revealed that a sheriff’s office in Texas searched data from more than 83,000 automated license plate reader (ALPR) cameras to track down a woman suspected of self-managing an abortion. (A scenario that may have been avoided, it’s worth noting, had Flock taken action when they were first warned about this threat three years ago).
Flock calls the reporting on the Texas sheriff’s office “purposefully misleading,” claiming the woman was searched for as a missing person at her family’s request rather than for her abortion. But that ignores the core issue: this officer used a nationwide surveillance dragnet (again: over 83,000 cameras) to track someone down, and used her suspected healthcare decisions as a reason to do so. Framing this as concern for her safety plays directly into anti-abortion narratives that depict abortion as dangerous and traumatic in order to justify increased policing, criminalization, control—and, ultimately, surveillance.
As if that weren’t enough, the company has also come under fire for how its ALPR network data is being actively used to assist in mass deportation. Despite U.S. Immigration and Customs Enforcement (ICE) having no formal agreement with Flock Safety, public records revealed “more than 4,000 nation and statewide lookups by local and state police done either at the behest of the federal government or as an ‘informal’ favor to federal law enforcement, or with a potential immigration focus.” The network audit data analyzed by 404 exposed an informal data-sharing environment that creates an end-run around oversight and accountability measures: federal agencies can access the surveillance network through local partnerships without the transparency and legal constraints that would apply to direct federal contracts.
Flock Safety is adamant this is “not Flock’s decision,” and by implication, not their fault. Instead, the responsibility lies with each individual local law enforcement agency. In the same breath, they insist that data sharing is essential, loudly claiming credit when the technology is involved in cross-jurisdictional investigations—but failing to show the same attitude when that data-sharing ecosystem is used to terrorize abortion seekers or immigrants.
Flock Safety: The Surveillance Social Network
In growing from a 2017 startup to a $7.5 billion company “serving over 5,000 communities,” Flock gave individual agencies wide latitude to set and regulate their own policies. In effect, this approach offered cheap surveillance technology with minimal restrictions, leaving major decisions and actions in the hands of law enforcement while the company scaled rapidly.
And they have no intention of slowing down. Just this week, Flock launched its Business Network, facilitating unregulated data sharing amongst its private sector security clients. “For years, our law enforcement customers have used the power of a shared network to identify threats, connect cases, and reduce crime. Now, we’re extending that same network effect to the private sector,” Flock Safety’s CEO announced.
The company is building out a new mass surveillance network using the exact template that ended with the company having to retrain thousands of officers in Illinois on how not to break state law—the same template that made it easy for officers to do so in the first place. Flock’s continued integration of disparate surveillance networks across the public and private spheres—despite the harms that have already occurred—is owed in part to the one thing that it’s gotten really good at over the past couple of years: facilitating a surveillance social network.
Employing marketing phrases like “collaboration” and “force multiplier,” Flock encourages as much sharing as possible, going so far as to claim that network effects can significantly improve case closure rates. The company cultivates a sense of shared community and purpose among users so that they opt into good-faith sharing relationships with other law enforcement agencies across the country. But it’s precisely that social layer that creates uncontrollable risk.
The possibility of human workarounds at every level undermines any technical safeguards Flock may claim. Search term blocking relies on officers accurately labeling their search intent—a system easily defeated by entering vague or incorrect justifications, whether intentionally or not. And, of course, words like “investigation” or “missing person” can mean virtually anything, offering nothing of value for meaningful oversight of how and why the system is being used. Moving forward, sheriff’s offices looking to avoid negative press can surveil abortion seekers or immigrants with ease, so long as they enter vague, unsuspicious reasons.
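To see why, consider a minimal sketch of how a keyword blocklist over the free-text reason field would work. This is an illustrative assumption, not Flock’s actual code: any such filter can only inspect whatever the officer chooses to type.

```python
BLOCKED_TERMS = {"abortion", "immigration", "deportation"}

def is_search_allowed(reason: str) -> bool:
    """Allow the ALPR search unless the stated reason contains a blocked term."""
    lowered = reason.lower()
    return not any(term in lowered for term in BLOCKED_TERMS)

print(is_search_allowed("had an abortion, search for female"))  # False: blocked
print(is_search_allowed("investigation"))    # True: a vague label sails through
print(is_search_allowed("missing person"))   # True: so does this one
```

The same abusive search, relabeled with a one-word reason, passes every check the system is capable of making.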
The same can be said for case number requirements, which depend on manual entry and can easily be circumvented by reusing legitimate case numbers for unauthorized searches. Audit logs only track inputs, not contextual legitimacy. And Flock’s proposed AI-driven audit alerts—which at best might flag suspicious activity after searches (and harm) have already occurred—rely on local agencies to self-monitor misuse, despite their demonstrated inability to do so.
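A similar sketch, again under stated assumptions (a hypothetical case-number format and log schema, not Flock’s real ones), shows why a gate that validates inputs and a log that only records them can’t establish legitimacy:

```python
import re

VALID_FORMAT = re.compile(r"^\d{4}-\d{6}$")
OPEN_CASES = {"2025-000417"}            # a real but unrelated open case
audit_log: list[dict[str, str]] = []    # the log records inputs, nothing more

def run_gated_search(case_number: str, reason: str) -> bool:
    allowed = bool(VALID_FORMAT.match(case_number)) and case_number in OPEN_CASES
    audit_log.append({"case": case_number, "reason": reason, "allowed": str(allowed)})
    return allowed

# Reusing a legitimate case number for an unauthorized search clears the gate,
# and the resulting log entry is indistinguishable from a lawful one.
print(run_gated_search("2025-000417", "investigation"))  # True
print(audit_log[-1])
```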
And, of course, even the most restrictive department policy may not be enough. Austin, Texas, had implemented one of the most restrictive ALPR programs in the country, and the program still failed: the city’s own audit revealed systematic compliance failures that rendered its guardrails meaningless. The company’s continued appeal to “local policies” means nothing when Flock’s data-sharing network does not account for how law enforcement policies, regulations, and accountability vary by jurisdiction. You may have a good relationship with your local police, who solicit your input on what their policy looks like; you don’t have that same relationship with hundreds or thousands of other agencies with whom they share their data. So if an officer on the other side of the country violates your privacy, it’d be difficult to hold them accountable.
ALPR surveillance systems are inherently vulnerable to both technical exploitation and human manipulation. These vulnerabilities are not theoretical—they represent real pathways for bad actors to access vast databases containing millions of Americans’ location data. When surveillance databases are breached, the consequences extend far beyond typical data theft—this information can be used to harass, stalk, or even extort. The intimate details of people’s daily routines, their associations, and their political activities may become available to anyone with malicious intent. Flock operates as a single point of failure that can compromise—and has compromised—the privacy of millions of Americans simultaneously.
Don’t Stop de-Flocking
Rather than addressing legitimate concerns about privacy, security, and constitutional rights, Flock has only promised updates that fall short of meaningful reforms. These software tweaks and feature rollouts cannot assuage the fear engendered by the massive surveillance system it has built and continues to expand.
Flock’s insistence that what’s happening with abortion criminalization and immigration enforcement has nothing to do with them—that these are just red-state problems or the fault of rogue officers—is concerning. Flock designed the network that is being used, and the public should hold them accountable for failing to build in protections from abuse that cannot be easily circumvented.
Thankfully, that’s exactly what’s happening: cities like Austin, San Marcos, Denver, Norfolk, and San Diego are pushing back. And it’s not nearly as hard a choice as Flock would have you believe: Austinites are weighing the benefits of a surveillance system that generates a hit less than 0.02% of the time (fewer than 15,000 hits out of 75 million scans) against the possibility that scanning 75 million license plates will result in an abortion seeker being tracked down by police, or an immigrant being flagged by ICE in a so-called “sanctuary city.” These are not hypothetical risks. This is already happening.
Given how pervasive, sprawling, and ungovernable ALPR sharing networks have become, the only feature update we can truly rely on to protect people’s rights and safety is no network at all. And we applaud the communities taking decisive action to dismantle this surveillance infrastructure.
Here’s yet another worrying development in the world of privately-owned security cameras. Flock Safety has made aggressive inroads into both the private and public sectors, aided greatly by the company’s ability to blend the two.
Much like Ring before it, Flock is pitching cheap cameras with local law enforcement buy-in, nudging residents towards leaving their cameras (some of which have license plate reader capabilities) open so law enforcement can search their plate captures without a warrant. Law enforcement agencies are also buying their own cameras to ensure people can’t travel very far without leaving at least a temporary record of their travels the government can access pretty much at will.
And this is how that meshing of public-private is playing out in real life. As Joseph Cox and Jason Koebler report for 404 Media, at least one law enforcement officer has used this meshed network of Flock ALPR cameras to help locate a woman who recently had an abortion.
On May 9, an officer from the Johnson County Sheriff’s Office in Texas searched Flock cameras and gave the reason as “had an abortion, search for female,” according to the multiple sets of data. Whenever officers search Flock cameras they are required to provide a reason for doing so, but generally do not require a warrant or any sort of court order. Flock cameras continually scan the plates, color, and model of any vehicle driving by, building a detailed database of vehicles and by extension peoples’ movements.
Cops are able to search cameras acquired in their own district, those in their state, or those in a nationwide network of Flock cameras. That single search for the woman spread across 6,809 different Flock networks, with a total of 83,345 cameras, according to the data. The officer looked for hits over a month long period, it shows.
Some of these cameras were likely owned and operated by private purchasers. But even with those excluded, it’s still a massive data set the government can access without having to offer up much in the way of justification. The justification here (one that was reflected in access audits from Flock systems located as far away as Washington state) seems especially ominous and especially flimsy: “had an abortion, search for female.”
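To put the scale of that single search in perspective, here’s a minimal, self-contained sketch (toy data structures, not Flock’s actual API) of the fan-out: one officer’s query runs against every camera in every network that has opted into sharing.

```python
def run_search(query: str, networks: list[dict[str, list[str]]]) -> list[str]:
    """Return every retained plate read matching the query, across all networks."""
    return [
        read
        for network in networks         # 6,809 networks in the reported search
        for reads in network.values()   # each camera's retained plate reads
        for read in reads
        if query in read
    ]

# Two toy networks standing in for thousands of real ones.
networks = [
    {"cam-1": ["ABC1234 2025-05-09", "XYZ9876 2025-05-09"]},
    {"cam-2": ["ABC1234 2025-05-12"]},
]
print(run_search("ABC1234", networks))  # one query, hits from every network
```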
The Johnson County Sheriff’s Office claims this search was performed to help, not harm.
Sheriff Adam King of the Johnson County Sheriff’s Office told 404 Media in a phone call that the woman self-administered the abortion “and her family was worried that she was going to bleed to death, and we were trying to find her to get her to a hospital.”
“We weren’t trying to block her from leaving the state or whatever to get an abortion,” he said. “It was about her safety.”
Even if that’s completely true, it’s not that comforting to know Texas law enforcement officers can perform the same searches for the purpose of prosecuting people who have sought abortions in nearby states where abortion remains legal. The justifications offered during the acquisition process always stress that the equipment will be used to deal with the most violent crimes. While utilizing the tech to search for a missing person is something most people would find acceptable, its proximity to the state’s recent abortion ban definitely isn’t an encouraging sign.
If these tools can be used this way, you can guarantee they will be used this way. Once one law enforcement agency gets the ball rolling on abortion arrests and weathers the press storm that it will provoke, the rest will follow suit, especially in areas populated by prosecutors with anti-abortion beliefs. Companies like Flock will just make everything easier for people looking to punish women for daring to explore their options and retain what’s left of their bodily autonomy.
When your local police department buys one piece of surveillance equipment, you can expect the company that sold it to try to upsell the department on additional tools and upgrades.
At the end of the day, public safety vendors are tech companies, and their representatives are salespeople using all the tricks from the marketing playbook. But these companies aren’t just after public money—they also want data.
And each new bit of data that police collect contributes to a pool of information to which the company can attach other services: storage, data processing, cross-referencing tools, inter-agency networking, and AI analysis. The companies may even want the data to train their own AI models. The landscape of the police tech industry is changing, and companies that once specialized in a single technology (hardware products like automated license plate readers (ALPRs) or gunshot detection sensors) have developed new capabilities or bought up other tech companies and law enforcement data brokers—all in service of becoming the corporate giant that serves as a one-stop shop for police surveillance needs.
One of the most alarming trends in policing is that companies are regularly pushing police to buy more than they need. Vendors regularly pressure police departments to lock in the price now for a whole bundle of features and tools in the name of “cost savings,” often claiming that the cost à la carte for any of these tools will be higher than the cost of a package, which they warn will also be priced more expensively in the future. Market analysts have touted the benefits of creating “moats” between these surveillance ecosystems and any possible competitors. By making it harder to switch service providers due to integrated features, these companies can lock their cop customers into multi-year subscriptions and long-term dependence.
Think your local police are just getting body-worn cameras (BWCs) to help with public trust or ALPRs to aid their hunt for stolen vehicles? Don’t assume that’s the end of it. If there’s already a relationship between a company and a department, that department is much more likely to get access to a free trial of whatever other device or software that company hopes the department will put on its shopping list.
These vendors also regularly help police departments apply for grants and waivers, and provide other assistance to find funding, so that as soon as there’s money available for a public safety initiative, those funds can make their way directly to their business.
Companies like Axon have been particularly successful at parlaying their existing relationships, and their ability to combine equipment, into “sole source” designations. Typically, government agencies must conduct a bidding process when buying a new product, be it toilet paper, computers, or vehicles. To be designated a sole-source provider, a company is supposed to offer a product that no other vendor can provide. If a company can get this designation, it can essentially eliminate any competition for particular government contracts. When Axon is under consideration as a vendor for equipment like BWCs, for which there are multiple other possible providers, it’s not uncommon to see a police department arguing for a sole-source procurement of Axon BWCs based on the company’s ability to directly connect its cameras to the Fusus system, another Axon product.
Here are a few of the big players positioning themselves to collect your movements, analyze your actions, and make you—the taxpayer—bear the cost for the whole bundle of privacy invasions.
Axon Enterprise’s ‘Suite’
Axon expects yet another year of $2 billion-plus in revenue in 2025. The company first got its hooks into police departments through the Taser, its electric stun gun. Axon then plunged into the BWC market amidst Obama-era outrage at police brutality and the flood of federal grant money flowing to local police departments for BWCs, which were widely promoted as a police accountability tool. Axon parlayed its relationships with hundreds of police departments, and its capture and storage of ever-growing terabytes of police footage, into a menu of new technological offerings.
In its annual year-end securities filing, Axon told investors it was “building the public safety operating system of the future” through its suite of “cloud-hosted digital evidence management solutions, productivity and real-time operations software, body cameras, in-car cameras, TASER energy devices, robotic security and training solutions” to cater to agencies in the federal, corrections, justice, and security sectors.
Axon controls an estimated 85 percent of the police body-worn camera market. Its Evidence.com platform, once a trial add-on for BWC customers, is now also one of the biggest records management systems used by police. Its other tools and services include record management, video storage in the cloud, drones, connected private cameras, analysis tools, virtual reality training, and real-time crime centers.
[Image: a slide from Axon’s Q4 2024 investor deck describing the tiers of the “Officer Safety Plan” (OSP) product package and highlighting that 95% of Axon customers are tied to a subscription plan.]
Axon has been adding AI to its repertoire, and it now features a whole “AI Era” bundle plan. One recent offering is Draft One, which connects to Axon’s body-worn cameras (BWCs) and uses AI to generate police reports based on the audio captured in the BWC footage. While use of the tool may start off as a free trial, Axon sees Draft One as another key product for capturing new customers, despite widespread skepticism of the accuracy of the reports, the inability to determine which reports have been drafted using the system, and the liability they could bring to prosecutions.
In 2024, Axon acquired a company called Fusus, a platform that combines the growing stores of data that police departments collect—notifications from gunshot detection and automated license plate reader (ALPR) systems; footage from BWCs, drones, public cameras, and sometimes private cameras; and dispatch information—to create “real-time crime centers.” The company now claims that Fusus is being used by more than 250 different policing agencies.
Fusus claims to bring the power of the real-time crime center to police departments of all sizes, which includes the ability to help police access and use live footage from both public and private cameras through an add-on service that requires a recurring subscription. It also claims to integrate nicely with surveillance tools from other providers. Recently, it has been cutting ties, most notably with Flock Safety, as it starts to envelop some of the options its frenemies had offered.
In the middle of April, Axon announced that it would begin offering fixed ALPR, a key feature of the Flock Safety catalogue, and an AI Assistant, which has been a core offering of Truleo, another Axon competitor.
Flock Safety’s Bundles and FlockOS
Flock Safety is another major police technology company that has expanded its focus from one primary technology to a whole package of equipment and software services.
Flock Safety started with ALPRs. These tools use a camera to read vehicle license plates, collecting the make, model, location, and other details that can be used for what Flock calls “Vehicle Fingerprinting.” The details are stored in a database and checked against “hot lists” provided by police officers; when there’s no match, the system simply stores and shares data on how, where, and when everyone is driving and parking their vehicles.
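To make that flow concrete, here’s a minimal sketch of the hot-list pattern described above. The data structures and field names are illustrative assumptions, not Flock’s actual schema; the point is that every read is retained whether or not it matches anything.

```python
from dataclasses import dataclass

@dataclass
class PlateRead:
    """One vehicle passing one camera (illustrative fields only)."""
    plate: str
    make: str
    model: str
    color: str
    camera_id: str
    timestamp: str

HOT_LIST = {"ABC1234"}          # plates flagged by police (e.g., stolen cars)
database: list[PlateRead] = []  # every read is kept, match or no match

def ingest(read: PlateRead) -> None:
    database.append(read)  # stored regardless of any hot-list hit
    if read.plate in HOT_LIST:
        print(f"ALERT: {read.plate} seen at {read.camera_id} ({read.timestamp})")

# A non-flagged driver's movements are recorded just the same.
ingest(PlateRead("XYZ9876", "Toyota", "Camry", "blue", "cam-17", "2025-05-09T14:02"))
ingest(PlateRead("ABC1234", "Ford", "F-150", "red", "cam-03", "2025-05-09T14:05"))
print(len(database))  # 2 -- both reads retained, only one triggered an alert
```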
Much of what Flock Safety does now comes together in their FlockOS system, which claims to bring together various surveillance feeds and facilitate real-time “situational awareness.”
Motorola Solutions’ ‘Ecosystem’
When you think of Motorola, you may think of phones—but there’s a good chance you missed the moment in 2011 when the phone side of the company, Motorola Mobility, split off from Motorola Solutions, which is now a big player in police surveillance.
On its website, Motorola Solutions claims that departments are better off using a whole list of equipment from the same ecosystem, boasting the tagline, “Technology that’s exponentially more powerful, together.” Motorola describes this as an “ecosystem of safety and security technologies” in its securities filings. In 2024, the company also reported $2 billion in sales, but unlike Axon, its customer base is not exclusively law enforcement and includes private entities like sports stadiums, schools, and hospitals.
Motorola’s technology includes 911 services, radio, BWCs, in-car cameras, ALPRs, drones, face recognition, crime mapping, and software that supposedly unifies it all. Notably, video can also come with artificial intelligence analysis, in some cases allowing law enforcement to search video and track individuals across cameras.
In January 2019, Motorola Solutions acquired Vigilant Solutions, one of the big players in the ALPR market, as part of its takeover of Vaas International Holdings. Now the company (under the subsidiary DRN Data) claims to have billions of scans saved from police departments and private ALPR cameras around the country. Marketing language for its Vehicle Manager system highlights that “data is overwhelming,” because the amount of data being collected is “a lot.” It’s a similar claim made by other companies: Now that you’ve bought so many surveillance tools to collect so much data, you’re finding that it is too much data, so you now need more surveillance tools to organize and make sense of it.
SoundThinking’s ‘SafetySmart Platform’
SoundThinking began as ShotSpotter, a so-called gunshot detection tool that uses microphones placed around a city to identify and locate sounds of gunshots. As news reports of the tool’s inaccuracy and criticism of it have grown, the company has rebranded as SoundThinking and added ALPRs, case management, and weapons detection to its offerings. The company is now marketing its SafetySmart platform, which claims to integrate different stores of data and apply AI analytics.
In 2024, SoundThinking laid out its whole scheme in its annual report, referring to it as the “cross-sell” component of their sales strategy.
The “cross-sell” component of our strategy is designed to leverage our established relationships and understanding of the customer environs by introducing other capabilities on the SafetySmart platform that can solve other customer challenges. We are in the early stages of the upsell/cross-sell strategy, but it is promising – particularly around bundled sales such as ShotSpotter + ResourceRouter and CaseBuilder + CrimeTracer. Newport News, VA, Rocky Mount, NC, Reno, NV and others have embraced this strategy and recognized the value of utilizing multiple SafetySmart products to manage the entire life cycle of gun crime…. We will seek to drive more of this sales activity as it not only enhances our system’s effectiveness but also deepens our penetration within existing customer relationships and is a proof point that our solutions are essential for creating comprehensive public safety outcomes. Importantly, this strategy also increases the average revenue per customer and makes our customer relationships even stickier.
Many of SoundThinking’s new tools rely on a push toward “data integration” and artificial intelligence. ALPRs can be integrated with ShotSpotter. ShotSpotter can be integrated with the CaseBuilder records management system, and CaseBuilder can be integrated with CrimeTracer. CrimeTracer, once known as COPLINK X, is a platform that SoundThinking describes as a “powerful law enforcement search engine and information platform that enables law enforcement to search data from agencies across the U.S.” EFF tracks this type of tool in the Atlas of Surveillance as a third-party investigative platform: software tools that combine open-source intelligence data, police records, and other data sources, including even those found on the dark web, to generate leads or conduct analyses.
SoundThinking, like a lot of surveillance tech, can be costly for departments, but the company seems to see value in preserving its existing police relationships even when it isn’t getting paid. In Baton Rouge, budget cuts recently eliminated the $400,000 annual contract for ShotSpotter, but the city continues to use it.
“They have agreed to continue that service without accepting any money from us for now, while we look for possible other funding sources. It was a decision that it’s extremely expensive and kind of cost-prohibitive to move the sensors to other parts of the city,” Baton Rouge Police Department Chief Thomas Morse told a local news outlet, WBRZ.
Beware the Bundle
Government surveillance is big business. The companies that provide surveillance and police data tools know that it’s lucrative to cultivate police departments as loyal customers. They’re jockeying for monopolization of the state surveillance market that they’re helping to build. While they may be marketing public safety in their pitches for products, from ALPRs to records management to investigatory analysis to AI everything, these companies are mostly beholden to their shareholders and bottom lines.
The next time you come across BWCs or another piece of tech on your city council’s agenda or police department’s budget, take a closer look to see what other strings and surveillance tools might be attached. You are not just looking at one line item on the sheet—it’s probably an ongoing subscription to a whole package of equipment designed to erode your privacy, and no discount makes that a price worth paying.
Late last fall, a number of Norfolk, Virginia residents — with the assistance of the Institute for Justice (IJ) — sued the city for blanketing Norfolk with nearly 200 automatic license plate readers (ALPRs) provided by Flock Safety.
Flock Safety made its first inroads with the private market, selling plate readers to gated communities and HOAs so busybodies could keep track of everyone driving in and out of their cul-de-sacs. Having captured that market, Flock moved on, targeting US law enforcement agencies with the promise of cheap ALPRs that could be tied into existing ALPR cameras deployed by private citizens.
It’s pretty much the Ring playbook — aggressive market growth that gives cops cheap buy-in so long as they sign long-term contracts to access images and footage. And it’s the same scheme: the implication that using consumer-oriented products will give cops instant access to a wider network of privately-owned cameras.
The Flock that became a swarm scored a small win in court before this lawsuit was filed: a state judge ruled that three hits from a private company’s plate reader weren’t quite enough to trigger a Fourth Amendment violation.
The IJ and its clients disagree. The lawsuit noted the city was now infested with cameras, something the police chief himself said “creates a nice curtain of technology.” “Curtain” is pretty much a blanket when it comes to fabric-based analogies. Police chief Mark Talbot also said “It would be difficult to drive anywhere of any distance without running into a camera.” That certainly sounds like a dragnet.
On top of that, records obtained from the Norfolk PD showed there was no direct or indirect oversight of officers’ access to ALPR data, which not only included plate/location data but also descriptive information about vehicles that investigators could use as search terms, rather than just the plate number itself.
And that’s how the city found itself getting sued by residents represented by the Institute for Justice. Less than five months after filing this suit, a federal judge has ruled this case can move forward.
As the decision [PDF] points out, there’s no denying this carpeting (fabric again!) of the city with cameras creates an inescapable network of government surveillance. And that sort of thing has been addressed by the Supreme Court, as well as courts at the appellate level.
Controlling precedent has deemed certain law enforcement surveillance methods as tantamount to a drag-net, finding that these technologies violate individuals’ subjective and reasonable objective expectations of privacy and therefore constitute a Fourth Amendment search. For example, the Supreme Court held that gathering all of an individual’s cell-site location information over a period of as little as seven days was a search, for it revealed the whole of an individual’s physical movements during that period. Carpenter, 585 U.S. at 311. Similarly, the Fourth Circuit held that aerial surveillance by plane, which captured second-by-second images of broad swaths of the City of Baltimore for close to 12 hours a day, sufficiently tracked the whole of one’s movements and was therefore a search.
That makes it clear the government can’t simply claim public movements have no expectation of privacy. They likely don’t in the singular, but the aggregate is what’s problematic in terms of constitutionality.
This court says the long-term tracking of people’s movements (via plate/location data from a network of cameras that are, as the police chief stated, inescapable) is the sort of thing the Carpenter decision addressed, even if it dealt with a different form of long-term tracking.
Relying on Carpenter, when this Court accepts Plaintiffs’ well-pled version of the facts and draws all reasonable inferences in their favor, as is required at this stage of the proceedings, the Court concludes that it is plausible that Plaintiffs subjectively believe they have a reasonable expectation of privacy that is being violated because the Flock camera system is creating a drag-net system of surveillance that effectively tracks the whole of Plaintiffs’ physical movements.
Given this, it’s unlikely the government can successfully argue that if Norfolk residents don’t want to be tracked by ALPRs, they can simply choose to walk or use public transportation. That argument hasn’t worked in multiple Supreme Court decisions where the government has claimed that if people don’t want their phone location data accessed without a warrant by investigators, they should just leave their phones at home. In this day and age, going without a phone is about as impractical as going without a car. These are essentials of everyday life, even in cities with marvelous public transportation systems. And I doubt Norfolk places highly on the list of “Best Public Transportation Systems.”
As one plaintiff notes, he can’t even leave his own neighborhood without being photographed by up to four Flock ALPR cameras. That’s ridiculous. And, as this court has ruled at this point, it’s also possibly unconstitutional.
When construed in Plaintiffs’ favor, as required at this stage of the case, the complaint alleges facts notably similar to those in Carpenter that the Supreme Court found to clearly violate society’s expectation of privacy: law enforcement secretly monitoring and cataloguing the whole of tens of thousands of individuals’ movements over an extended period. In short, the Court finds that considering existing precedent, the well-pled facts plausibly allege a violation of an objectively reasonable expectation of privacy.
Now, of course, this doesn’t mean the plaintiffs have won. But the important thing at this point in the litigation is that the government hasn’t won. The suit has not been dismissed. The fight continues. And, given the tone of this decision, it appears the government will need to bring some new arguments to the defense of its ALPR dragnet, because the usual stuff is foreclosed by precedent. With any luck, the Institute for Justice and its clients will, at the very least, secure a warrant requirement for access to ALPR databases — something that would be the first of its kind in the nation.