Researchers amplify variations in video, making the invisible visible

22 Jun 2012


At this summer's Siggraph - the premier computer-graphics conference - researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) will present new software that amplifies variations in successive frames of video that are imperceptible to the naked eye. So, for instance, the software makes it possible to actually "see" someone's pulse, as the skin reddens and pales with the flow of blood, and it can exaggerate tiny motions, making visible the vibrations of individual guitar strings or the breathing of a swaddled infant in a neonatal intensive care unit.
 
The system is somewhat akin to the equaliser in a stereo sound system, which boosts some frequencies and cuts others, except that the pertinent frequency is the frequency of colour changes in a sequence of video frames, not the frequency of an audio signal. The prototype of the software allows the user to specify the frequency range of interest and the degree of amplification. The software works in real time and displays both the original video and the altered version of the video, with changes magnified.
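To make the analogy concrete, here is a minimal sketch, not the researchers' own code, of how such a temporal "equaliser" could be applied to a video in Python with NumPy and SciPy: each pixel's time series is band-pass filtered around the frequency range of interest, and the filtered signal is scaled and added back. The function name, array layout and parameter values are illustrative assumptions.

# Minimal sketch of temporal-frequency amplification on a video (assumptions:
# the clip is already loaded as a float array `frames` of shape (T, H, W)
# with values in [0, 1]; names and parameters are illustrative only).
import numpy as np
from scipy.signal import butter, sosfiltfilt

def amplify_temporal_band(frames, fps, low_hz, high_hz, gain):
    """Band-pass each pixel's time series and add the amplified band back."""
    # Design a Butterworth band-pass covering the frequency range of interest,
    # e.g. roughly 0.8-3 Hz for a resting human pulse.
    sos = butter(2, [low_hz, high_hz], btype="bandpass", fs=fps, output="sos")
    # Filter along the time axis (axis 0): every pixel is treated as a 1-D signal.
    band = sosfiltfilt(sos, frames, axis=0)
    # Boost the selected band and recombine it with the original video.
    return np.clip(frames + gain * band, 0.0, 1.0), band

# Example: exaggerate pulse-rate colour changes in a 30 fps clip.
# amplified, band = amplify_temporal_band(frames, fps=30, low_hz=0.8, high_hz=3.0, gain=50)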
 
Although the technique lends itself most naturally to phenomena that recur at regular intervals - such as the beating of a heart, the movement of a vibrating string or the inflation of the lungs - if the range of frequencies is wide enough, the system can amplify changes that occur only once. So, for instance, it could be used to compare different images of the same scene, allowing the user to easily pick out changes that might otherwise go unnoticed. In one set of experiments, the system was able to dramatically amplify the movement of shadows in a street scene photographed only twice, at an interval of about 15 seconds.
 
The MIT researchers - graduate student Michael Rubinstein, recent alumni Hao-Yu Wu '12, MNG '12 and Eugene Shih SM '01, PhD '10, and professors William Freeman, Fredo Durand and John Guttag - intended the system to amplify colour changes, but in their initial experiments, they found that it amplified motion as well. "We started from amplifying colour, and we noticed that we'd get this nice effect, that motion is also amplified," Rubinstein says. "So we went back, figured out exactly why that happens, studied it well, and saw how we can incorporate that to do better motion amplification."
 
Using the system to amplify motion rather than colour requires a different kind of filtration, and it works well only if the motions are relatively small. But of course, those are exactly the motions whose amplification would be of interest.
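Why small motions are the sweet spot can be seen with a first-order argument, sketched here under the simplifying assumption of a one-dimensional image profile $f$ displaced by a small amount $\delta(t)$:

$I(x,t) = f\big(x + \delta(t)\big) \approx f(x) + \delta(t)\, f'(x)$

A temporal band-pass filter that passes $\delta(t)$ extracts roughly $B(x,t) \approx \delta(t)\, f'(x)$; amplifying this by a factor $\alpha$ and adding it back gives

$\tilde{I}(x,t) \approx f(x) + (1+\alpha)\,\delta(t)\, f'(x) \approx f\big(x + (1+\alpha)\,\delta(t)\big)$

so the motion appears magnified by $(1+\alpha)$. The approximation breaks down once $(1+\alpha)\,\delta(t)$ is no longer small compared with the spatial scale over which the image varies, which is why the amplification works best on subtle motions.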
 
Rubinstein envisions that, among other applications, the system could be used for "contactless monitoring" of hospital patients' vital signs. Boosting one set of frequencies would allow measurement of pulse rates, via subtle changes in skin coloration; boosting another set of frequencies would allow monitoring of breathing.
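For the pulse-rate case, one simple, purely illustrative way to turn the band-passed signal from the earlier sketch into a number is to average it over a patch of skin and read off the dominant temporal frequency; the region-of-interest coordinates below are assumptions.

# Illustrative sketch (not the researchers' code): estimating a pulse rate from
# the band-passed video `band` returned by the sketch above, with `roi` a pair
# of slices covering a patch of skin in the frame.
import numpy as np

def estimate_pulse_bpm(band, fps, roi):
    # Average the filtered signal over the skin region: one value per frame.
    trace = band[(slice(None),) + roi].mean(axis=(1, 2))
    # Find the dominant temporal frequency of that trace.
    spectrum = np.abs(np.fft.rfft(trace - trace.mean()))
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)
    return 60.0 * freqs[np.argmax(spectrum)]  # beats per minute

# Example: pulse = estimate_pulse_bpm(band, fps=30, roi=(slice(100, 150), slice(200, 250)))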

The approach could be particularly useful with infants who are born prematurely or otherwise require early medical attention. "Their bodies are so fragile, you want to attach as few sensors as possible," Rubinstein says.
 
Similarly, Rubinstein says, the system could be used to augment video baby monitors for the home, so that the respiration of sleeping infants would be clearly visible. A father himself, Rubinstein says that he and his wife equipped their daughter's crib with commercial pressure sensors intended to gauge motion and reassure anxious parents that their children are still breathing. "Those are kind of expensive," Rubinstein says, "and some people really complain about getting false positives with them. So I can really see how this type of technique will be able to work better."
 
In their paper, the researchers describe experiments in which they began investigating both of these applications. But since they've begun giving talks on the work, Rubinstein says, colleagues have proposed a range of other possible uses, from laparoscopic imaging of internal organs, to long-range-surveillance systems that magnify subtle motions, to contactless lie detection based on pulse rate.
 
"It's a fantastic result," says Maneesh Agrawala, an associate professor in the electrical engineering and computer science department at the University of California at Berkeley, and director of the department's Visualisation Lab.

Agrawala points out that Freeman and Durand were part of a team of MIT researchers who made a splash at the 2005 Siggraph with a paper on motion magnification in video. "This approach is both simpler and allows you to see some things that you couldn't see with that old approach," Agrawala says. "The simplicity of the approach makes it something that has the possibility for application in a number of places. I think we'll see a lot of people implementing it because it's fairly straightforward."
