The answer to the Clearview AI scandal is better privacy laws, not anti-scraping laws

Clearview AI (previously) is a grifty facial recognition company that sells untested, secretive tools to police departments, claiming that they can identify people from security camera footage by matching those images against pictures scraped from big social media sites.


It turns out — unsurprisingly — that Clearview’s marketing copy is not a reliable source of impartial evidence about the quality of its products, and neither are the testimonials of the cops who urged their bosses to buy Clearview products.


Nevertheless, Clearview is a creepy, grifty, privacy-invading toolsmith in the service of authoritarians, getting rich by covertly supplying its overhyped tools, and, unsurprisingly, lots of people (including me) want structural changes that would force Clearview to cut it out and prevent future Clearviews from emerging.

However, the remedy favored by the Big Tech monopolists whose platforms Clearview used for raw materials is to ban scraping, something Big Tech has been aggressively seeking to criminalize. The problem with this is that scraping bans represent the best hope monopolists have for maintaining their monopolies: if it’s against the law to extract your own data from Facebook’s walled garden, then Facebook doesn’t need to fear that a competitor will create a tool that lets you stay in touch with your Facebook friends without using Facebook itself (by logging into Facebook as you, scraping your waiting messages, and letting you reply without ever touching Facebook yourself).
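To make the mechanics concrete, here’s a minimal, purely illustrative sketch (in Python) of what that kind of scraping amounts to: re-using the user’s own login session to fetch a page they’re already allowed to see, then parsing the data out of the HTML. The URL, cookie name, and CSS selectors below are hypothetical placeholders, not Facebook’s actual endpoints or markup, and this isn’t any real tool’s code.

```python
# Hypothetical sketch of scraping a logged-in page the user can already see.
# The URL, session cookie, and CSS selectors are made-up placeholders,
# not any real site's endpoints or markup.
import requests
from bs4 import BeautifulSoup

SESSION_COOKIE = {"session_id": "YOUR-OWN-SESSION-TOKEN"}  # the user's own login session
INBOX_URL = "https://social.example.com/messages"          # placeholder inbox page


def fetch_waiting_messages():
    # Fetch the page exactly as the logged-in user's browser would.
    resp = requests.get(INBOX_URL, cookies=SESSION_COOKIE, timeout=10)
    resp.raise_for_status()

    # Parse the HTML and pull out sender names and message text.
    soup = BeautifulSoup(resp.text, "html.parser")
    messages = []
    for row in soup.select("div.message"):  # hypothetical selector
        sender = row.select_one(".sender")
        body = row.select_one(".body")
        if sender and body:
            messages.append((sender.get_text(strip=True), body.get_text(strip=True)))
    return messages


if __name__ == "__main__":
    for sender, body in fetch_waiting_messages():
        print(f"{sender}: {body}")
```

The point of the sketch is that there is nothing exotic about the technique itself: the same few lines underlie competitors’ interoperability tools, academic and security research, and Clearview’s face-harvesting alike, which is why banning the technique, rather than regulating what’s done with the resulting data, sweeps so broadly.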


The Computer Fraud and Abuse Act — a federal hacking statute from 1986! — is the preferred tool for blocking scraping (it’s the law that was used to threaten Aaron Swartz with a long prison sentence after he wrote a tool that mass-downloaded scientific articles he was entitled to read from MIT’s network). It’s a dumpster fire of a law, passed in a moral panic engendered by the 1983 Matthew Broderick movie WarGames (I’m not making this up!), and it has grown more dangerous and less relevant with every year since.


If we want to protect privacy, we should pass a federal privacy law — something Big Tech has fought tooth and nail — that regulates what you do with scraped data, without criminalizing an activity that is key to competition, user empowerment, and academic and security research.


The CFAA is one of the few options available to companies that want to stop scrapers, which is part of the problem. “It’s a 1986, pre-internet statute,” says Williams. “If that’s the best we can do to protect our privacy with these very complicated, very modern problems, then I think we’re screwed.”

Civil liberties groups and technology companies both have been calling for a federal law that would establish Americans’ right to privacy in the digital era. Clearview, and companies like it, make the matter that much more urgent. “We need a comprehensive privacy statute that covers biometric data,” says Williams.

Right now, there’s only a patchwork of state regulations that potentially provide those kinds of protections. The California Consumer Privacy Act, which went into effect this month, gives state residents the right to ask companies like Clearview to delete the data they collect about them. Other regulations, like the Illinois Biometric Information Privacy Act, require corporations to obtain consent before collecting biometric data, including faces. A class action lawsuit filed earlier this week accuses Clearview of violating that law. Texas and Washington have similar regulations on the books, but don’t allow for private lawsuits; California’s law also doesn’t provide a private right of action.

Scraping the Web Is a Powerful Tool. Clearview AI Abused It [Louise Matsakis/Wired]

(Image: MGM/UA)
