Face masks force a facial recognition reckoning

In a landmark ruling issued Tuesday, the UK Court of Appeal said a British police force violated human rights by unlawfully using facial recognition technology. Already mired in controversy, with civil rights and privacy violations stacking up, facial recognition is facing its biggest challenge yet: face masks. With the cameras rendered useless in the age of COVID-19, facility managers are turning the divisive technology off and reexamining its use. With lawsuits piling up, court rulings coming down, and tech giants pulling out, it’s time for the PropTech industry to pump the brakes on facial recognition technology.

In 2018, researchers at MIT and Microsoft sparked controversy when they released a report showing that facial recognition algorithm errors were 49 times more likely for dark-skinned women than for white men. Subsequent studies have found other forms of bias. Coupled with privacy concerns surrounding the largely unregulated technology, the $3.2 billion facial recognition industry is in rough waters as the controversial cameras are thwarted by masks during a global pandemic.

Recognizing The Issues

Rite Aid, which added facial recognition technology to 200 stores across the United States, announced that the cameras would be turned off after a Reuters investigation found its China-linked technology was deployed largely in lower-income, non-white neighborhoods. Macy’s is facing a class action lawsuit claiming the retailer’s use of Clearview AI facial recognition software is an invasion of privacy.

“This decision was in part based on a larger industry conversation,” Rite Aid told Reuters in a statement, adding that “other large technology companies seem to be scaling back or rethinking their efforts around facial recognition given increasing uncertainty around the technology’s utility.”

Rite Aid isn’t the only commercial operation to experiment with the technology. In retail, Target and Walmart experimented with facial recognition to combat shoplifting and fraud. Marriott tested the technology to recognize and greet guests. KFC and McDonald’s have used facial recognition to test new payment methods and assess the quality of customer service. Some New York landlords have installed the technology to recognize tenants and outsiders. Offices have invested in it for keyless entry and other forms of automated access. Wary of the emerging technology’s unproven efficacy and the lack of federal regulation, few companies have taken these experiments mainstream. Still, half of all U.S. adults are already included in police facial recognition databases.

Now, the technology is becoming functionally obsolete, unable to see the faces beneath our masks. That’s why earlier this year Apple made it easier for iPhone users to unlock their phones without the company’s facial recognition system. None of this has stopped the industry in its tracks. New uses for the technology aimed at fighting the pandemic are beginning to proliferate. In Pasadena, business owners have banded together to launch the nation’s first hands-free facial recognition payment system, offering contact-free purchasing. The New York Mets and Los Angeles Football Club are testing the technology for contactless fan entry. These new uses still require a view of the whole face, meaning users have to remove their masks.

Fighting Against Face Masks

Vornado, the landlord of roughly 19 million square feet of office space in Manhattan, is moving forward with a wider rollout of facial recognition across its properties to enable seamless, touchless entry, according to Business Insider. Occupants will have the chance to opt out. Office workers may be less concerned with privacy than they once were: researchers have found that opinions on surveillance and tracking have changed in light of the pandemic. Soon, Facebook employees at Vornado’s Farley Building, where the company just inked a 730,000-square-foot lease, will be deciding whether or not to opt out. During tests of the technology, only about 40 percent of tenants opted in, raising questions about the system’s usefulness.

“Virtually everyone who has used the technology has liked it,” Vornado vice chairman David Greenbaum told Business Insider. “I never had a preconceived notion of what the adoption rate would be, but as our tenants see others using it, they are becoming increasingly comfortable with the technology.”

Face masks have become the best way to protect yourself from both the coronavirus and privacy violations. As mask use becomes ubiquitous across the country, facial recognition algorithms are breaking down. Error rates in popular facial recognition technologies are spiking to between five and 50 percent, according to a study by the U.S. National Institute of Standards and Technology (NIST). The algorithms weren’t that good to begin with. That’s welcome news to privacy and civil rights advocates. For law enforcement, it’s the latest cause for concern.

In May, the U.S. Department of Homeland Security circulated an internal memo raising concerns about the “potential impacts that widespread use of protective masks could have on security operations that incorporate face recognition systems,” according to reporting by The Intercept. The memo warns of bad actors exploiting public health guidelines to hide their identities and avoid consequences for criminal activity. With civil rights protests in full swing across the United States, the memo amplifies Orwellian concerns about the technology’s use and its bias against Black and minority communities. Facial recognition technology has already been used to arrest protesters in Baltimore, linking faces from the protests to social media profiles.

Backlash To Bias

Lawmakers at the local, state, and federal level are moving forward with legislation to ban the technology.

Study after study shows commercial facial recognition tech is biased and inaccurate. One study from the National Institute of Standards and Technology found the systems falsely identified Black and Asian faces 10 to 100 times more often than white faces. An MIT study had similar findings, showing that some facial recognition systems from large tech companies were far less accurate at identifying female and darker-skinned faces than white male faces. The American Civil Liberties Union found Amazon’s Rekognition software misidentified 28 members of Congress as criminals.

“While some biometric researchers and vendors have attempted to claim algorithmic bias is not an issue or has been overcome, this study provides a comprehensive rebuttal,” Joy Buolamwini, a researcher at the MIT Media Lab who led one of the facial recognition studies, said in an email to the New York Times. “We must safeguard the public interest and halt the proliferation of face surveillance.”

IBM, once a pioneer in developing facial recognition, is pulling the plug on its facial recognition partnerships with law enforcement. Big Blue will no longer provide facial recognition technology to police departments for mass surveillance and racial profiling, Arvind Krishna, IBM’s chief executive, wrote in a letter to Congress.

“Artificial Intelligence is a powerful tool that can help law enforcement keep citizens safe,” Krishna wrote. “But vendors and users of AI systems have a shared responsibility to ensure that AI is tested for bias, particularly when used in law enforcement, and that such bias testing is audited and reported.”

Microsoft and Amazon have recently taken a similar tack. In June, Microsoft President Brad Smith announced the company would not sell facial recognition to police until Congress passed federal legislation regulating the emerging technology. Amazon announced a one-year ban on police use of its Rekognition software. Those decisions are made easier by the fact that facial recognition no longer works on a population covering its face.

“Their products have the alarming potential to infringe on Americans’ privacy rights in ways that we would have thought unimaginable not long ago,” Sen. Edward Markey (D-MA) said after Amazon’s announcement. “Pressing pause on the use of this technology by law enforcement is a positive step, but what Amazon should really do is a complete about-face and get out of the business of dangerous surveillance altogether.”

The coronavirus has forced the issue. Ethical concerns about facial recognition have quickly been eclipsed by questions of efficacy. In a world full of masks, facial recognition cameras are blind. Until we’re past the pandemic, it’s hard to see the point of fighting for, or even defending, the technology.

Three of the largest players in facial recognition backing away from the technology should give pause to commercial owners and managers looking to implement it in their office buildings or retail stores. The technology creates more problems than it solves. Right now, it can’t solve any. [Propmodo]