A New York school district will move forward with its facial recognition pilot program next week, despite an explicit order from the New York State Education Department that it wait until a data privacy and security standard is finalized for all state educational agencies.
On Friday, the Lockport school district said it was “confident” that the data collection policy for its facial recognition system was sound enough that it could begin testing it on campuses June 3.
“[State Education Department] representatives previously communicated to the District their recommendation that the System not become operational until the dialogue between the District and SED with regard to student data security and privacy is complete,” the statement, sent by the district’s director of technology, Robert LiPuma, to BuzzFeed News, said. “However, the District’s Initial Implementation Phase of the System (which will commence June 3, 2019 and continue through August 31, 2019) will not include any student data being entered into the System database or generated by the System.”
Two hours after this story published, JP O’Hare, a representative of the New York State Education Department, said in an email, “The district has assured us no facial recognition software will be used next week while it tests other components of the system.” NYSED and Lockport did not respond to multiple follow-up questions about what specific parts of the system would be tested and what would not, though it’s clear certain components of the facial recognition system will be tested next week.
The Lockport pilot comes amid increased scrutiny of facial recognition's efficacy across the US and growing concerns by civil rights activists that the tech may serve to further entrench societal biases. Earlier this month, San Francisco banned police from using facial recognition, and similar bills are now emerging in other cities. Amazon has endured persistent pressure, including from its own shareholders, for its aggressive marketing of its Rekognition facial recognition system to law enforcement agencies. And in a congressional hearing on the technology last week, Rep. Alexandria Ocasio-Cortez expressed concern that facial recognition could be used as a form of social control.
At the same time, reports and studies of facial recognition’s inaccuracies and mistakes — especially on women and people of color — continue to emerge.
Lockport implied that its facial recognition system should not be a privacy concern because it “does not compile information on and track the movements of all District students, staff and visitors.” Instead, the statement explained, the system is “limited to identifying whether an individual whose photograph has been entered into the System database is on District property (i.e., is visible on one of the District’s security cameras).” But it also said the individuals who may be entered into the database included those who are prohibited from being on district property, “such as suspended students or staff.”
Meanwhile, previous reporting from BuzzFeed News has shown that in order to effectively flag the faces of “persons of interest,” facial recognition systems must also scan and disregard the faces of people who are not of interest. In other words, they analyze those faces too.
“One problem is that it doesn’t matter who is on the list when it comes to the error rates of these systems,” said Clare Garvie, an associate at Georgetown Law’s Center on Privacy and Technology, who has studied facial recognition technology extensively. “In the case of a false match — say a parent is apprehended [by law enforcement] because they get falsely misidentified as a sex offender, what happens?” Lockport has not released any data on the accuracy of its facial recognition system.
The concern is not just who can be enrolled, Garvie said, but what the redress policy is if an individual is enrolled by accident, or if they believe they should not be on a persons-of-interest list. “What is the follow-up mechanism?” she asked. “What are the checks in place?”
Johanna Miller, director of the Education Policy Center at New York Civil Liberties Union, which has investigated Lockport’s rollout of facial recognition technology, called the system “a colossal waste of taxpayer dollars and energy that should be spent educating kids.” To date, Lockport has spent $1.4 million of the $4.2 million it was awarded through the New York Smart Schools Bond Act to get its system up and running.
“Lockport has demonstrated a reckless disregard for the privacy of their students and the larger school community for far too long,” Miller said. “We are calling on the state legislature to adopt a moratorium on face surveillance to protect every student in the state from the reckless use of harmful technology.” ●