Scientists analyse plants’ health in seconds with Google Glass app

12 Feb 2015

Scientists from the California NanoSystems Institute at the University of California, Los Angeles (UCLA) have developed a Google Glass app that, when paired with a hand-held device, enables the wearer to quickly analyse the health of a plant without damaging it.

The app analyses the concentration of chlorophyll, the substance in plants responsible for converting sunlight into energy. Reduced chlorophyll production in plants can indicate degradation of water, soil or air quality.

One current method for measuring chlorophyll concentration requires removing some of the plant's leaves, dissolving them in a chemical solvent and then performing a chemical analysis. With the new system, leaves are examined in place and left intact and functional.

The research, led by Aydogan Ozcan, associate director of the UCLA California NanoSystems Institute and Chancellor's Professor of Electrical Engineering and Bioengineering at the UCLA Henry Samueli School of Engineering and Applied Science, was published online by the Royal Society of Chemistry journal Lab on a Chip.

The system developed by Ozcan's lab uses an image captured by the Google Glass camera to measure chlorophyll's light absorption in the green part of the optical spectrum.
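
The article does not publish the team's algorithm, but the general idea can be sketched as a Beer-Lambert-style absorbance computed from the green channel of a leaf's transmission image, mapped to a concentration by a calibration line. In the sketch below, the constants SLOPE and INTERCEPT, the file paths and the function names are hypothetical placeholders, not the paper's actual method:

```python
# Illustrative sketch only: estimate relative chlorophyll content from a
# leaf's green-channel transmission against a reference (no-leaf) image.
import numpy as np
from PIL import Image

# Hypothetical linear calibration constants (absorbance -> concentration).
SLOPE, INTERCEPT = 52.0, 1.5

def green_absorbance(leaf_path: str, reference_path: str) -> float:
    """Absorbance in the green band: -log10(leaf intensity / reference intensity)."""
    leaf = np.asarray(Image.open(leaf_path).convert("RGB"), dtype=float)
    ref = np.asarray(Image.open(reference_path).convert("RGB"), dtype=float)
    i_leaf = leaf[..., 1].mean()  # channel index 1 = green
    i_ref = ref[..., 1].mean()
    return -np.log10(max(i_leaf, 1.0) / max(i_ref, 1.0))

def chlorophyll_estimate(leaf_path: str, reference_path: str) -> float:
    """Map the measured absorbance to a concentration via the calibration line."""
    return SLOPE * green_absorbance(leaf_path, reference_path) + INTERCEPT
```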

The main body of the hand-held illuminator unit can be produced by 3-D printing and runs on three AAA batteries; with the addition of a small circuit board, it can be assembled for less than $30.

Held behind the leaf, facing the Glass wearer, the illuminator emits light that enhances the leaf's transmission image contrast, indoors or out, regardless of environmental lighting conditions.

The wearer can control the device using the Google Glass touch control pad or with the voice command, "Okay, Glass, image a leaf." The Glass photographs the leaf and sends an enhanced image wirelessly to a remote server, which processes the data from the image and sends back a chlorophyll concentration reading, all in less than 10 seconds.
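
The capture-and-query loop the article describes can be sketched as a simple client that uploads an image and reads back the result. The server URL and the JSON field name below are hypothetical; the team's actual API is not documented in the article:

```python
# Minimal sketch of the client side of the workflow: upload a leaf photo to a
# remote analysis server and return its chlorophyll reading.
import requests

SERVER_URL = "https://example.org/api/leaf-analysis"  # hypothetical endpoint

def analyse_leaf(image_path: str, timeout_s: float = 10.0) -> float:
    """Send the leaf image and return the server's chlorophyll concentration."""
    with open(image_path, "rb") as f:
        resp = requests.post(SERVER_URL, files={"image": f}, timeout=timeout_s)
    resp.raise_for_status()
    return resp.json()["chlorophyll_concentration"]  # hypothetical field name

if __name__ == "__main__":
    print(f"Chlorophyll reading: {analyse_leaf('leaf.jpg'):.2f}")
```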

"One pleasant surprise we found was that we used five leaf species to calibrate our system, and that this same calibration worked to accurately detect chlorophyll concentration in 15 different leaf species without having to recalibrate the app," Ozcan said. "This will allow a scientist to get readings walking from plant to plant in a field of crops, or look at many different plants in a drought-plagued area and accumulate plant health data very quickly."
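
The calibration idea behind that result can be illustrated as fitting one absorbance-to-concentration line on a small set of species and then applying it unchanged to new ones. All data values below are invented for illustration; the paper's real calibration measurements are not reproduced in the article:

```python
# Sketch: fit a single linear calibration on reference pairs, then reuse it
# on new leaves without refitting.
import numpy as np

# Hypothetical (absorbance, lab-measured chlorophyll) pairs from five species.
absorbance = np.array([0.12, 0.25, 0.33, 0.41, 0.58])
chlorophyll = np.array([8.0, 15.5, 20.1, 25.4, 34.9])  # e.g. ug/cm^2

slope, intercept = np.polyfit(absorbance, chlorophyll, deg=1)

def predict(a: float) -> float:
    """Apply the fixed calibration to a new measurement without refitting."""
    return slope * a + intercept

print(f"Predicted chlorophyll at absorbance 0.30: {predict(0.30):.1f} ug/cm^2")
```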

The Google Glass app and illuminator unit could replace relatively costly and bulky laboratory instruments. Ozcan said that the convenience, speed and cost-effectiveness of the new system could aid scientists studying the effects of droughts and climate change in remote areas.

Ozcan's laboratory specialises in computational imaging, sensing and diagnostic devices for various mobile-health and telemedicine applications. Its previous work includes quick analysis of food samples for allergens, water samples for heavy metals and bacteria, and cell counts in blood samples.

The research team has also devised a way to use Google Glass to process diagnostic test results, as well as an app and attachment that convert a smartphone into a fluorescence microscope for imaging single viruses and individual DNA molecules.

The study's first author was Bingen Cortazar, a UCLA graduate student; co-authors were Hatice Ceylan Koydemir, a UCLA postdoctoral scholar, and Derek Tseng and Steve Feng, researchers in Ozcan's lab.

Support for Ozcan's lab is provided by the Presidential Early Career Award for Scientists and Engineers, the Army Research Office Life Sciences Division, the National Science Foundation, the Office of Naval Research, the Howard Hughes Medical Institute and the National Institutes of Health.