
Why no one has made a tool to turn off Facebook oversharing

The debate over whether Cambridge Analytica’s harvesting of tens of millions of Facebook profiles was a “breach” turns on whether Cambridge Analytica did anything wrong under Facebook’s own policies.


By default, Facebook shares your friends’ data with the apps you interact with through its “platform,” and turning that feature off requires a needlessly intricate, deliberately complex dance with Facebook’s bewildering privacy dashboard.


Privacy advocates are publishing recipes you can follow to prevent this oversharing, but this raises the question: why not just automate the process, making a tool that autopilots your browser to undo all of Facebook’s terrible defaults? This isn’t a complex tool, and it’s exactly the kind of thing that software is great for: automating a complex, fiddly task with a robot that does it perfectly every time.
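To see how small such a tool would be, here is a minimal sketch in Python using Selenium to drive a browser through the settings flow. Everything Facebook-specific in it is a hypothetical placeholder: the settings URL and the CSS selectors are invented for illustration, since Facebook’s real pages are undocumented and change constantly. The point is only that the automation itself is a few dozen lines.

# Sketch only: the settings URL and data-testid selectors below are invented
# placeholders, not Facebook's real markup. A real tool would have to track
# the site's actual (and frequently changing) pages.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

SETTINGS_URL = "https://www.facebook.com/settings?tab=applications"  # hypothetical path

def disable_platform_sharing(driver):
    # Open the apps/platform settings page and switch the sharing feature off.
    driver.get(SETTINGS_URL)
    wait = WebDriverWait(driver, 15)
    # Click the (hypothetical) edit control for the platform/apps setting.
    edit = wait.until(EC.element_to_be_clickable(
        (By.CSS_SELECTOR, "[data-testid='platform_edit']")))
    edit.click()
    # Confirm the (hypothetical) "Turn Off" dialog.
    turn_off = wait.until(EC.element_to_be_clickable(
        (By.CSS_SELECTOR, "[data-testid='platform_turn_off']")))
    turn_off.click()

if __name__ == "__main__":
    # Launches a fresh browser session; a real tool would reuse a logged-in profile.
    driver = webdriver.Firefox()
    try:
        disable_platform_sharing(driver)
    finally:
        driver.quit()

That is all it would take, mechanically; the hard part, as the rest of this piece argues, isn’t the code.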


The answer is that it’s become legally terrifying to make tools like that. We can still make ad-blockers (for now), but making a tool that lets users modify how they interact with their technology is increasingly fraught, whether that’s a tool to force your printer to accept third-party ink, a tool to force your phone to accept third-party apps, or a tool that automates privacy-maximizing changes to your Facebook settings.


Larry Lessig taught us that our world is regulated by four forces: law (what’s legal), markets (what’s profitable), norms (what’s moral) and code (what’s technologically possible). Companies like Cambridge Analytica and Facebook get to deploy all four forces to push us to behave in ways that benefit them, but increasingly code is off the table when it comes to pushing back.


We need technologists to thoughtfully communicate technical nuance to lawmakers; to run businesses that help people master their technology; to passionately make the case for better technology design.

But we also need our technologists to retain the power to affect millions of lives for the better. Skilled toolsmiths can automate the process of suing Equifax, filing for housing aid after you’re evicted, fighting a parking ticket or forcing an airline to give you a refund if your ticket’s price drops after you buy it (and that’s all just one programmer, and he hasn’t even graduated yet!).

When we talk about “walled gardens,” we focus on the obvious harms: an App Store makes one company the judge, jury and executioner of whose programs you can run on your computer; apps can’t be linked into, and they disappear from our shared references; platforms get to spy on you when you use them; opaque algorithms decide what you hear (and thus who gets to be heard).

But more profoundly, the past decade’s march to walled gardens has limited what we can do about all these things. We still have ad-blockers (but not for “premium video” anymore, because writing an ad-blocker that bypasses DRM is a potential felony), but we can’t avail ourselves of tools to auto-configure our privacy dashboards, or snoop on our media players to see if they’re snooping on us, or any of a thousand other useful and cunning improvements to our technologically mediated lives.


Yet Another Lesson from the Cambridge Analytica Fiasco: Remove the Barriers to User Privacy Control
[Cory Doctorow/EFF]
