Officials at the Lockport, New York, school district have purchased face recognition technology as part of a purported effort to prevent school shootings. Starting in September, all 10 of the Lockport district’s school buildings, just north of Buffalo, will be outfitted with a surveillance system that can identify faces and objects. The software, known as Aegis, was developed by SN Technologies Corp., a Canadian biometrics firm that specifically advertises to schools. It can be used to alert officials whenever sex offenders, suspended students, fired employees, suspected gang members, or anyone else placed on a school’s “blacklist” enters the premises. Aegis also sends alerts any time one of the “top 10” most popular guns used in school shootings appears in view of a camera.

The district is spending most of its recent $4 million state “Smart School” grant on these and other enhancements to its security systems, including bullet-proof greeter windows and a mass notification system, according to the Niagara Gazette. “We always have to be on our guard. We can’t let our guard down,” Lockport Superintendent Michelle T. Bradley told the Buffalo News. “For the Board of Education and the Lockport City School District, this is the No. 1 priority: school security.”

Yet given the nature of gun violence at schools, Lockport’s purchase of surveillance technology appears inefficient and expensive. All of the major school shootings in the last five years in the U.S. have been carried out by current students or alumni of the school in question. “These are students for whom the school wouldn’t have a reason to have their face entered into the face recognition system’s blacklist,” explained Rachel Levinson-Waldman, a security and policing expert at the Brennan Center for Justice.

The object recognition system seems similarly pointless, she said. Most shooters don’t brandish their guns before opening fire, and by the time a weapon is visible, an algorithm that can identify the exact model being fired is of little practical use. As Jim Shultz, a Lockport parent, pointed out to the Buffalo News, the technology would give a school, at best, only a few extra seconds of response time. And because most shootings end within seconds, face or weapon recognition would provide about as much real-time value as a 911 call. Lockport schools, Shultz added, have already instituted preventative, if less flashy, measures, such as keeping doors locked and requiring visitors to check in.

Because face recognition appears uniquely ill-suited to respond directly to school shootings — which are themselves statistically rare events — privacy experts fear that the primary function of the technology will be to expand the surveillance and criminalization of adolescents. “Whether it was intended to be this way or not, Lockport’s technology is effectively going to be a surveillance system and not a safety system,” Levinson-Waldman said.

Learning While Black

Lockport’s system will store data for up to 60 days. Students will not be automatically entered into its database, but administrators will be able to use the system to “follow” those who commit infractions. “If we had a student who committed some type of offense against the code of conduct, we can follow that student throughout the day to see maybe who they interacted with,” one school official said.

Levinson-Waldman says that such surveillance powers are likely to be wielded disproportionately against students of color, who already face disciplinary bias at school. Several studies have shown that black and Latino children are routinely viewed as more dangerous than their white peers, regardless of their behavior. Black students are not more likely to misbehave than white students, yet they are more likely to be suspended, receive corporal punishment, or have a school-related arrest, according to a Government Accountability Office report released in April.

“This is going to exacerbate the racial disparities you already see, whether it’s about monitoring or enforcement,” said John Cusick, a fellow at the NAACP Legal Defense and Educational Fund. Students who are already subject to police surveillance in their neighborhoods, he added, will now have to face the same environment at their school. Research by the NAACP LDF and others suggests that putting police officers in schools increases the number of students who end up incarcerated for harmless incidents. “Face recognition might curtail how students interact. They might be afraid of being linked to other students or engaging in adolescent behavior,” Cusick continued. “It has the ability to criminalize friendships.”

Civil rights attorneys raised questions about whether biometric data collected at schools like Lockport might fuel not only the school-to-prison pipeline, but also deportations. Schools might share their biometric data with law enforcement and, in turn, with Immigration and Customs Enforcement — or vice versa. In California and New Jersey, ICE has arrested undocumented parents as they dropped their kids off at school, and ICE was recently granted access to at least one automatic license plate reader database. Levinson-Waldman wonders which faces will be put into the system to flag a response and whether schools will connect to a law enforcement database. “Will they give ICE information about that parent’s movements, intentionally or not?” she asked.

Royal Palms Middle School eighth-grade students Nicole Johnson, left, Morgan Johnson, center, and Devon Rood talk near one of the school’s two facial recognition cameras in Phoenix, Ariz., on Dec. 16, 2003. The school is the first in the country to use the system, which was installed to identify sex offenders who may enter the school’s building.

Photo: Jeff Topping/Getty Images

New Directions in Education

Lockport is not the first school district to deploy the kind of advanced surveillance technology typically used by prisons, airports, and border checkpoints. Face recognition has already been installed in high schools in Magnolia, Arkansas, and St. Louis. A school district in New Mexico deploys shot-spotter technology, which notifies police at the first sound of gunfire, while another district in New York has acquired automatic license plate readers. In Iowa and Texas, among others, school districts have equipped law enforcement and school resource officers with body-worn cameras. “This is part of a bigger trend of school districts, as well as police departments, touting cost-benefit savings in the name of expanding surveillance,” Cusick noted.

Privacy advocates are also concerned about the kind of lessons that early, invasive surveillance teaches students about the society we live in. “Communities and schools need to think hard about what type [of] message they are sending to our kids when they monitor them in school like they were prisoners in a detention facility,” Rita Sklar, executive director of the American Civil Liberties Union of Arkansas, said in a statement. “We urge the Magnolia School Board, and all Arkansas school districts, to avoid these expensive, harmful gimmicks and consider more sensible approaches to keeping schools safe.”

Of course, these surveillance systems do not come cheap, and it’s possible that companies will make money from the data they collect from students. “I would be shocked if a vendor was not going to use video as a way to train and ‘improve’ the algorithm,” said Philip Hagen, a technologist who went to school in Lockport. “That’s inherent in any machine learning operation. From an ethical standpoint, I would definitely be concerned about that.” It is not evident who owns the data, how long it is retained, and whether parents have a right to opt out of the system.

The technology’s cost becomes particularly troubling when one considers that it may not even work. Some face recognition software performs poorly. U.K. police made headlines recently when documents revealed that one system in Wales turned up false matches 92 percent of the time. The technology is also more prone to err on black faces, as several studies have shown. In one of the most recent, MIT Media Lab researcher Joy Buolamwini showed that some face recognition algorithms misclassified darker-skinned women as often as 35 percent of the time. Other studies have shown that the technology is less effective on children. Hagen and Levinson-Waldman have asked whether the technology will be independently audited for accuracy.

Levinson-Waldman says that by investing in this technology rather than other resources, schools are emphasizing policing at the expense of teaching. As the Buffalo News reported, the Lockport District may face a shortfall of nearly $1 million in the 2018-2019 school year. If aid does not materialize, it is possible that the district will “cut transportation and sports programs, reduce kindergarten to half days and close elementary school libraries.”

“In a time when we cannot afford to pay our teachers a decent wage,” said Andrew Ferguson, a policing and civil rights expert, “I cannot fathom any school district paying money for this type of security theater.”