Legendary DNS hacker Dan Kaminsky has a new, out-of-left-field project to mitigate color blindness with augmented reality software for mobile phones. DanKam is a mobile app that you calibrate so it knows the specifics of your color blindness (I can’t see a lot of greens), and then it automatically color-corrects the world as seen through the phone’s lens to compensate for your deficit.
In terms of use cases: matching clothes, correctly parsing status lights on gadgets, and managing parking structures are all possibilities. In the long run, helping pilots and truckers and even SCADA engineers might be nice. There are a lot of systems with warning lights, and they aren't always obvious to the color blind.
Really, though, not being color blind, I can't imagine how this technology will end up being used. I'm pretty sure it won't be used to break the Internet, though, and for once, that's fine by me.
Ultimately, why do I think DanKam works? The basic theory runs like so (there's a rough code sketch after the list):
1. The visual system is trying to assign one of a small number of hues to every surface.
2. Color blindness, as a shift of the green receptor towards red, confuses this assignment.
3. It is possible to emit a “cleaner signal”, such that even colorblind viewers can see colors, and the differences between colors, accurately.
4. It has nothing to do with DNS. (I kid! I kid! But no, really. Nothing.)
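Not having seen DanKam's internals, here is a minimal sketch of what that theory could look like in code, assuming the correction happens in HSV space: quantize every pixel's hue to a small set of canonical hues (point 1), then shift the greens away from red so a shifted green receptor can still separate them (points 2 and 3). The bucket count, the green range, and the shift amount are illustrative assumptions, not DanKam's real parameters.

```python
# A rough sketch of the theory above, not DanKam's actual code: snap every
# pixel's hue to a small set of canonical hues, then push the greens away
# from red so a shifted green receptor can still tell them apart.
# The bucket count, green range, and shift amount are illustrative assumptions.
import numpy as np
import cv2  # OpenCV, assumed available for the color-space conversions

HUE_BUCKETS = 12   # assumed number of canonical hues (point 1)
GREEN_SHIFT = 15   # assumed shift, in OpenCV hue units (0-179 covers the wheel)

def correct_frame(bgr_frame: np.ndarray) -> np.ndarray:
    """Return a color-corrected copy of one camera frame (BGR, uint8)."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV).astype(np.int16)
    hue = hsv[..., 0]

    # Point 1: quantize hue to the nearest canonical value.
    bucket = 180 // HUE_BUCKETS
    hue = ((hue + bucket // 2) // bucket * bucket) % 180

    # Points 2 and 3: greens (roughly 40-80 in OpenCV units) get shifted
    # further from red, a stand-in for "emitting a cleaner signal".
    greens = (hue >= 40) & (hue <= 80)
    hue[greens] = (hue[greens] + GREEN_SHIFT) % 180

    hsv[..., 0] = hue
    return cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)
```

In a real app something like this would run on every frame coming off the camera, and the calibration step described above would choose the hue boundaries and shift to match the user's particular deficit.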
DanKam: Augmented Reality For Color Blindness
(Thanks, Dan!)