Disney researchers put gesture recognition in door knobs, chairs, fish tanks

A joint Disney Research and CMU team has produced a demo showing gesture controls on a variety of everyday, non-computer objects. The system, called Touché, uses capacitive coupling to infer how your hands are touching an object. It can determine which utensil you're eating with, how you're grasping a doorknob, or even whether you're touching one finger to another or clasping your hands together. It's a pretty exciting demo, and the user interface possibilities are certainly provocative. Here's some commentary from Wired UK's Mark Brown:
Some of the proof-of-concept applications in the lab include a smart doorknob that knows whether it has been grasped, touched, or pinched; a chair that dims the lights when you recline into it; a table that knows if you’re resting one hand, two hands, or your elbows on it; and a tablet that can be pinched from back to front to open an on-screen menu.
The technology can also be shoved in wristbands, so you can make sign-language-style gestures to control the phone in your pocket—two fingers on your palm to change a song, say, or a clap to stop the music. It can also go in liquids, to detect when fingers and hands are submerged in water.
“In our laboratory experiments, Touché demonstrated recognition rates approaching 100 percent,” claims Ivan Poupyrev, senior research scientist at Disney Research in Pittsburgh. “That suggests it could immediately be used to create new and exciting ways for people to interact with objects and the world at large.”
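The Touché researchers describe the underlying technique as swept frequency capacitive sensing: rather than measuring the capacitive response at a single frequency, the object is excited across a range of frequencies, and the shape of the resulting response curve (which changes with how a hand contacts the object) is fed to a classifier trained on example postures. The following is a minimal, hypothetical Python sketch of that kind of pipeline, not Disney's implementation; the hardware stand-in `measure_response`, the frequency range, the feature choice, and the posture labels are all illustrative assumptions.

```python
# Hypothetical sketch of a swept-frequency capacitive sensing pipeline:
# sweep excitation frequencies, record the response profile, and classify
# the profile into a known hand posture. All names and values are illustrative.

import numpy as np
from sklearn.svm import SVC

FREQS_HZ = np.linspace(1e3, 3.5e6, 200)   # illustrative sweep range

def measure_response(freqs):
    """Placeholder for hardware: return the signal amplitude measured at
    each excitation frequency. Here we fabricate a noisy profile."""
    return np.random.rand(len(freqs))

def features(profile):
    """Simple features of a capacitive profile: the raw curve plus its
    gradient, which captures where the response shifts with grip."""
    return np.concatenate([profile, np.gradient(profile)])

# Training: collect labelled response profiles for each posture on the object.
postures = ["no_touch", "one_finger", "full_grasp", "pinch"]
X = np.array([features(measure_response(FREQS_HZ)) for _ in range(200)])
y = np.random.choice(len(postures), size=200)   # stand-in labels

clf = SVC(kernel="rbf").fit(X, y)

# Runtime: classify the current profile into one of the trained postures.
current = features(measure_response(FREQS_HZ))
print(postures[clf.predict([current])[0]])
```

With real sensor data in place of the random profiles, the same structure (sweep, featurize, classify) is what would let a doorknob distinguish a grasp from a pinch, or a table tell one hand from two.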