UK cops' porn-spotting software can't tell the difference between sand and skin

British cops got software to help them identify children in abuse images, but it’s having trouble telling the difference between skin and sand.

…the department is working with “Silicon Valley providers” to help train the AI to successfully scan for images of child abuse. But as we’ve seen, even the most powerful tech companies can’t seem to deploy algorithms that don’t fuck up every once in a while. They have promoted dangerous misinformation, abetted racism, and accidentally censored bisexual content. And when Gizmodo recently tested an app intended to automatically identify explicit images in your camera roll, it flagged a photo of a dog, a donut, and a fully clothed Grace Kelly.

What are the odds that London’s finest paid fabulous sums of money for what amounts to a dumb ImageMagick script that averages and compares color samples from various points on each image, and are now paying fabulous sums more to have some rinky-dink machine learning grafted on?
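
For the curious: a color-averaging “classifier” of the kind being mocked here really is trivial to write, and just as trivially fooled. The sketch below is purely illustrative (the filename, thresholds, and sample grid are all made up, and this is not anything the Met actually runs): it samples a grid of pixels, averages them, and declares “skin” whenever the mean color lands in a broad tan range that sandy beaches also happen to occupy.

```python
# A deliberately naive "skin detector": sample a grid of pixels, average them,
# and flag the image if the mean color falls inside a crude tan/beige box.
# All values here are invented for illustration.
from PIL import Image

# Rough RGB box covering many light skin tones; beach sand and dunes fall
# inside it too, which is exactly the failure mode described above.
SKIN_RGB_MIN = (120, 80, 60)
SKIN_RGB_MAX = (255, 220, 190)


def naive_skin_score(path, grid=5):
    """Average color samples from a grid of points and report whether the
    mean lands inside the crude 'skin' color box."""
    img = Image.open(path).convert("RGB")
    w, h = img.size
    samples = [
        img.getpixel((int((i + 0.5) * w / grid), int((j + 0.5) * h / grid)))
        for i in range(grid)
        for j in range(grid)
    ]
    avg = tuple(sum(channel) / len(samples) for channel in zip(*samples))
    looks_like_skin = all(
        lo <= value <= hi
        for value, lo, hi in zip(avg, SKIN_RGB_MIN, SKIN_RGB_MAX)
    )
    return avg, looks_like_skin


if __name__ == "__main__":
    # A photo of a sandy beach averages out to roughly the same beige as skin,
    # so this "detector" happily flags desert wallpapers. ("beach.jpg" is a
    # hypothetical example file.)
    avg, flagged = naive_skin_score("beach.jpg")
    print(f"average color: {avg}, flagged as skin: {flagged}")
```

That confusion of sand with skin is the whole joke: any approach that reduces an image to an average color can’t distinguish a beach from a body, which is why actual explicit-image detection needs far more than color statistics.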