My latest Locus column is Teaching Computers Shows Us How Little We Understand About Ourselves, an essay about how ideas we think of as simple and well-understood — names, families, fairness in games — turn out to be transcendentally complicated when we try to define them in rule-based terms for computers. I'm especially happy with how this came out.
Systems like Netflix and Amazon Kindle try to encode formal definitions of "family" based on assumptions about where you live (someone is in your immediate family if you share a roof), how you're genetically related (someone is immediate family if you have a close blood-tie), how you're legally related (someone is in your family if the government recognizes your relationship), or how many of you there are (families have no more than X people in them). All of these definitions are materially incorrect in innumerable situations.
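To make that concrete, here's a minimal, entirely hypothetical sketch (Python, not anyone's real code) of what those assumptions look like once they're written down as rules. Every name and threshold in it is invented; each comment marks a situation the rule gets wrong.

    from dataclasses import dataclass, field

    @dataclass
    class Person:
        name: str
        household_id: str                                    # "you share a roof"
        blood_relatives: set = field(default_factory=set)    # "close blood-tie"
        legal_relatives: set = field(default_factory=set)    # "the state recognizes you"

    MAX_FAMILY_SIZE = 6                                      # "no more than X people"

    def is_immediate_family(a: Person, b: Person, family_size: int) -> bool:
        # Each test encodes one of the assumptions named above.
        if family_size > MAX_FAMILY_SIZE:
            return False                          # large families stop counting as families
        return (
            a.household_id == b.household_id      # wrong for families separated by war or work
            or b.name in a.blood_relatives        # wrong for adoption, step-kin, chosen family
            or b.name in a.legal_relatives        # wrong where the state refuses to recognize a marriage
        )

Each test looks sensible on its own; the trouble is that every False result lands on a real family the rule doesn't fit.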
What's worse, by encoding errors about the true shape of family in software, companies and their programmers often further victimize the already-victimized — for example, by not recognizing the familial relationship between people who have been separated by war, or people whose marriage is discriminated against by the state on the basis of religion or sexual orientation, or people whose families have been torn apart by violence.
The ambiguity that is inherent in our human lives continues to rub up against our computerized need for rigid categories in ways small and large. Facebook wants to collapse our relationships with one another into categories that conform more closely to its corporate strategy than to reality: there's no way to define your relationship with your boss as "Not a friend, but I have to pretend he is."
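As a hypothetical illustration of that rigidity (an invented taxonomy, not Facebook's actual schema), a fixed list of relationship types simply has no slot for the relationship just described:

    from enum import Enum

    # Invented taxonomy for illustration only. There is no member for
    # "not a friend, but I have to pretend he is", so that relationship
    # cannot be recorded at all.
    class Relationship(Enum):
        FRIEND = "friend"
        FAMILY = "family"
        COLLEAGUE = "colleague"
        ACQUAINTANCE = "acquaintance"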