Identity theft's newest target: your face

A lot of companies struggle with bias in the workplace, but for many big tech companies, the problem is a bit more extreme. Why? Because it's not just the human beings who are racist; many of their algorithms are biased too. This is the problem Google was reportedly trying to solve when it got itself into its latest privacy scandal, tricking black and brown “volunteers” into submitting to 3D face scans.

According to the New York Daily News, Google contractors told these biometric test subjects that they were just going to play a “selfie game,” and that they'd even get paid for it. Paid all of five bucks, so that Google could use their identities to build more accurate facial recognition, holding onto their facial models forever. And it gets even worse: not only were contractors reportedly lying, but they also targeted homeless people and others in dire financial straits, hoping they would be less likely to complain publicly.

The whole reason Google did this in the first place is that facial recognition is more than a little bit biased. We think of software as objective, but the truth is that artificial intelligence tools like facial recognition depend on thousands of subjective choices made by the human beings who build them. Those human choices can automate discrimination: facial recognition products already on the market are dozens or even hundreds of times less accurate for women, trans people, people of color, children, and other groups. In short, these tools often only work well for middle-aged white guys.

Biometric surveillance like facial recognition, iris scans, and gait detection isn't just discriminatory; it's deeply invasive. These two concerns are why the City Council is contemplating a response, holding a hearing today on bills that would push back on the growing surveillance net in our stores, schools, and, increasingly, even our homes.

One bill from Council Member Brad Lander would stop building owners from forcing tenants to use facial recognition or other surveillance systems as an alternative to a physical key. This isn't some theoretical concern. We already see New Yorkers fighting facial recognition in their own buildings. The systems are not only sold on the questionable promise that they could replace your front door key, but they can also log every move you make in building hallways. Landlords claim that this is for tenants' own good, that they will benefit from the biometric dragnet, but the tenants don't believe them.

Another bill from Council Member Ritchie Torres would require stores to tell us when they're using biometric surveillance. Giant retailers have already invested millions in monitoring our every movement, using our phones as a virtual tracking collar. Stop and look at a product, and they know. Buying something with cash to keep it private? If you have your phone on you, they may know. Phone tracking is bad, but biometric surveillance is even worse, making it easier for companies to analyze and log every step we take and every shopping choice we make.

Lastly, a measure from Council Member Donovan Richards would require building owners to tell the city whenever they use biometric tracking, and would order the city to create a database of these systems. It would help the public get a better sense of the scale of these tracking tools, which are already found in nearly every corner of our city.

The bills are a helpful step, but they sadly don't go far enough. At a time when cities like San Francisco and Somerville are banning facial recognition completely, the Council continues to push half measures. It's not enough to simply tell the public that we're being surveilled; it's time for elected officials to go further and actually stop the surveillance. None of these bills would have prevented the sort of campaign we saw from Google, and they won't stop the next firm from using these new technologies to target our most vulnerable communities.

Even more urgently, the city needs to look at its own track record: physician, heal thyself. If these bills pass, our own agencies will fall short of the standard being proposed for the rest of the public. The NYPD and other agencies continue to use biometric surveillance like facial recognition and DNA dragnets without any notice to the public. It should be just the opposite: we should have higher standards for government than for the local bodega. It's long past time for New York City to go much further to credibly protect the public's privacy.


Cahn is the executive director of The Surveillance Technology Oversight Project at the Urban Justice Center, a New York-based civil rights and privacy organization.