Remote control, with a wave of a hand

27 Sep 2011

Playing a computer game once meant sitting on the couch and pushing buttons on a controller, but those buttons have been disappearing of late, replaced by human gestures that guide the action.
  
Soon gestures may be controlling more than just games. Scientists at Microsoft Research and the University of Washington have come up with a new system that uses the human body as an antenna. The technology could one day be used to turn on lights, buy a ticket at a train station kiosk, or interact with a world of other computer applications. And no elaborate instruments would be required.
 
"You could walk up to a ticket-purchasing machine, stand in front and make a gesture to be able to buy your ticket - or set the kind of gas you want at the gas station," said Desney Tan, a senior researcher at Microsoft Research and one of the creators of the technology. The system, demonstrated so far only in experiments, is "a fascinating step forward," said Joseph A. Paradiso, an associate professor of media arts and sciences at the Massachusetts Institute of Technology and co-director of the Things That Think Consortium.
 
There is no reason to fear that the new technology will affect people's health, he said; it merely exploits electromagnetic fields that are already in the air. "Suddenly someone takes advantage of it and opens up an example that is potentially useful," he said of the new gesture technology.
 
The innovation is potentially inexpensive, as it requires neither a handheld wireless wand, like the Nintendo Wii's, nor the instrumentation of Microsoft's Kinect, which uses infrared light and cameras to track motion.
 
Instead, the technology uses something that is always with us, unless we live in the wilderness: ambient electromagnetic radiation emitted as a matter of course by the wiring in households, by the power lines above homes, and by those gas pumps at the service station.
 
The human body produces a small signal as it interacts with this ambient electrical field. The new system employs algorithms to interpret and harness that interaction.
 
In initial tests, the technology determined people's locations and gestures from the way their bodies interacted with the electrical field, said one of its inventors, Shwetak N. Patel, an assistant professor at the University of Washington.
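In rough terms, the signal processing might resemble the sketch below: sample a body-coupled voltage, measure the strength of the power-line harmonics that shift as the body moves, and feed those measurements to a simple classifier. The sampling rate, gesture names, synthetic data and classifier here are illustrative assumptions, not the researchers' actual pipeline.

    # Illustrative sketch only: classify gestures from a body-coupled voltage signal.
    # The feature is the magnitude of power-line harmonics (60 Hz and multiples),
    # which change as the body moves relative to in-wall wiring.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    FS = 2000                      # assumed sampling rate of the analog front end, in Hz
    HARMONICS = [60, 120, 180]     # power-line fundamental and harmonics to measure

    def harmonic_features(signal):
        """Return the spectral magnitude at each power-line harmonic."""
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
        return np.array([spectrum[np.argmin(np.abs(freqs - h))] for h in HARMONICS])

    def synthetic_capture(gesture, n=FS):
        """Stand-in for a real capture: each gesture perturbs the harmonics differently."""
        t = np.arange(n) / FS
        gains = {"wave": (1.0, 0.5, 0.2), "touch_wall": (0.4, 1.2, 0.6), "idle": (0.2, 0.2, 0.2)}[gesture]
        clean = sum(g * np.sin(2 * np.pi * h * t) for g, h in zip(gains, HARMONICS))
        return clean + 0.05 * np.random.default_rng().standard_normal(n)

    # Train on a handful of labeled captures, then classify a fresh one.
    labels = ["wave", "touch_wall", "idle"] * 10
    X = np.array([harmonic_features(synthetic_capture(g)) for g in labels])
    model = KNeighborsClassifier(n_neighbors=3).fit(X, labels)
    print(model.predict([harmonic_features(synthetic_capture("wave"))]))

The appeal of this kind of design is that nothing has to be transmitted on purpose; the features come entirely from radiation the wiring already emits.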
 
Matt Reynolds, an assistant professor of electrical and computer engineering at Duke, who collaborates with Dr. Patel, said it has long been known that people function as antennas as they move near power lines - for example, those within the walls of a home. "What's new," he said, "is leveraging those signals as useful data that can be the basis for an interface for a computer system."
 
Dr. Patel has had a longstanding interest in delving into signals in the home and finding new uses for them. He helped to found a company called Zensi, which created an energy monitoring device that can be plugged into any outlet in a home to figure out which appliances are drawing power. (The company was sold last year.)
 
He is also working on a system to monitor home water use. By detecting minute changes in pressure at a spigot, it infers how much water the toilet or dishwasher is consuming.
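As a rough illustration of that idea, a monitor might watch for sudden pressure drops at the spigot and match each drop to the fixture whose signature it most resembles. The thresholds and fixture signatures below are invented for the sketch, not values from Dr. Patel's system.

    # Rough sketch: attribute sudden pressure drops at one spigot to a fixture.
    FIXTURE_SIGNATURES = {      # assumed pressure drop in psi when each fixture opens
        "toilet": 8.0,
        "dishwasher": 5.0,
        "faucet": 2.0,
    }

    def classify_event(pressure_drop_psi):
        """Match a pressure drop to the fixture with the closest known signature."""
        return min(FIXTURE_SIGNATURES, key=lambda f: abs(FIXTURE_SIGNATURES[f] - pressure_drop_psi))

    def detect_events(samples, threshold=1.5):
        """Yield (index, fixture) for each sample-to-sample drop larger than the threshold."""
        for i in range(1, len(samples)):
            drop = samples[i - 1] - samples[i]
            if drop > threshold:
                yield i, classify_event(drop)

    readings = [60.0, 60.1, 52.2, 52.1, 52.0, 57.9, 58.0]   # made-up pressure readings, in psi
    print(list(detect_events(readings)))                     # -> [(2, 'toilet')]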
 
Practical applications of gesture technology will take time to develop, Dr. Tan of Microsoft cautioned. One of those applications may be in homes, where a wave of the hand might control lighting, security systems, air-conditioners or televisions. Of course, designers must take care that gestures don't accidentally set off a device, he said. For example, a device could be set to respond only after an unusual gesture - perhaps drawing a circle in the air, or touching the wall with a certain number of fingers. Once users have made that gesture, the system knows that what follows is intended as a command.
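One simple way to realize that safeguard is a gate that ignores gestures until the unusual "wake" gesture is seen, then accepts commands for a few seconds. The gesture names and time window below are assumptions made for illustration.

    # Sketch of intentional activation: ignore gestures until a wake gesture arrives,
    # then treat gestures as commands for a short window before disarming again.
    import time

    WAKE_GESTURE = "draw_circle"
    COMMAND_WINDOW_S = 5.0        # assumed length of the armed window, in seconds

    class GestureGate:
        def __init__(self):
            self.armed_until = 0.0

        def handle(self, gesture):
            now = time.monotonic()
            if gesture == WAKE_GESTURE:
                self.armed_until = now + COMMAND_WINDOW_S
                return "armed"
            if now < self.armed_until:
                return f"execute:{gesture}"   # e.g. toggle the lights
            return "ignored"                  # stray gesture, no recent wake

    gate = GestureGate()
    print(gate.handle("swipe_left"))    # ignored
    print(gate.handle("draw_circle"))   # armed
    print(gate.handle("swipe_left"))    # execute:swipe_left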
 
Such home-automation devices would have to be calibrated individually for each household, and recalibrated if people moved to another home, as the homes would have different wiring. But Robert Jacob, a professor of computer science at Tufts University, said that such calibration would be a relatively minor chore for machine learning.
 
"A computer can be quickly trained to do that," he said.
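Such a calibration step might amount to little more than the sketch below: the occupant repeats each gesture a few times in the new home, and a fresh classifier is fit to those samples. The capture routine, gesture labels and classifier are placeholders for illustration, not the actual system.

    # Sketch of per-home calibration: gather a few labeled repetitions of each
    # gesture in the new home, then fit a model specific to that home's wiring.
    import numpy as np
    from sklearn.svm import SVC

    DEMO_LEVELS = {"lights_on": 0.0, "lights_off": 3.0, "tv_mute": 6.0}   # stand-in signal levels

    def capture_features(gesture_label):
        """Placeholder: would record the body-antenna signal in this home and extract features."""
        return DEMO_LEVELS[gesture_label] + np.random.default_rng().normal(scale=0.3, size=5)

    def calibrate(gestures, repetitions=5):
        """Ask the occupant to repeat each gesture a few times, then fit a per-home model."""
        X, y = [], []
        for g in gestures:
            for _ in range(repetitions):
                X.append(capture_features(g))
                y.append(g)
        return SVC(kernel="rbf").fit(np.array(X), y)

    home_model = calibrate(["lights_on", "lights_off", "tv_mute"])
    print(home_model.predict([capture_features("tv_mute")]))

Moving to a new home would simply mean running the same short routine again against the new wiring.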