Yahoo has released a machine-learning model called open_nsfw that is designed to distinguish not-safe-for-work images from worksafe ones. By tweaking the model and combining it with Places-CNN, MIT's scene-recognition model, Gabriel Goh created a bunch of machine-generated scenes that score high for both models — things that aren't porn, but look porny.
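Roughly speaking, the trick is activation maximization: treat the pixels of an image as the thing being optimized, and run gradient ascent until both classifiers report high scores. Here's a minimal sketch of that general idea in PyTorch, using two tiny hypothetical stand-in classifiers — Goh's actual pipeline is built on the Caffe-based open_nsfw and Places-CNN models and uses stronger image priors, so treat the names, learning rate, and regularizer below as illustrative assumptions, not his implementation:

```python
# Sketch of joint activation maximization: start from noise and nudge the
# image toward high scores under two classifiers at once. The two models
# below are hypothetical stand-ins for open_nsfw and Places-CNN.
import torch
import torch.nn as nn


class TinyScorer(nn.Module):
    """Stand-in for a pretrained classifier that outputs a single score in [0, 1]."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, 1)

    def forward(self, x):
        h = self.features(x).flatten(1)
        return torch.sigmoid(self.head(h)).squeeze(1)


nsfw_model = TinyScorer()   # stand-in for open_nsfw's NSFW probability
scene_model = TinyScorer()  # stand-in for one Places-CNN scene class

# Freeze the classifiers; only the image is being optimized.
for model in (nsfw_model, scene_model):
    model.eval()
    for p in model.parameters():
        p.requires_grad_(False)

# The "parameters" here are the pixels themselves, starting from random noise.
image = torch.rand(1, 3, 224, 224, requires_grad=True)
optimizer = torch.optim.Adam([image], lr=0.05)

for step in range(200):
    optimizer.zero_grad()
    # A small total-variation term discourages high-frequency noise.
    tv = (image[..., 1:, :] - image[..., :-1, :]).abs().mean() + \
         (image[..., :, 1:] - image[..., :, :-1]).abs().mean()
    # Maximize both scores (minimize their negative sum).
    loss = -(nsfw_model(image) + scene_model(image)).mean() + 0.1 * tv
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        image.clamp_(0.0, 1.0)  # keep pixel values in a valid range
```

The interesting design choice is the joint objective: optimizing for the NSFW score alone tends to produce abstract, garish textures, while adding the scene model's score pulls the result toward something that reads as a beach, a canyon or a concert hall — which is what makes Goh's outputs so uncanny.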
Goh’s results are impressive: there’s a set of beach scenes, canyon scenes, concert scenes, gallery scenes, coral reef scenes, desert scenes, museum scenes, tower scenes and volcano scenes that all make a porn-recognition algorithm fire on all cylinders and also make the scene-recognition model twitch.
Goh writes: "This program produces the most remarkable results. The images generated range from the garishly explicit to the subtle, but the subtle images are the most fascinating because, to my surprise, they are only seemingly innocent. These are not adversarial examples per se. The NSFW elements are all present, just hidden in plain sight. Once you see the true nature of these images, something clicks and it becomes impossible to unsee. I've picked a few of my favorite results to show here."
Image Synthesis from Yahoo’s open_nsfw
[Gabriel Goh/GitLab]
(via Metafilter)