Interface Ecology Lab unveils contactless ZeroTouch technology

16 May 2011

Texas A&M University's Interface Ecology Lab exhibited ZeroTouch, a unique optical multi-finger sensing technology, at the ACM CHI 2011 conference in Vancouver, which ended on 12 May.

ZeroTouch allows for zero-force, zero-thickness, completely transparent multi-touch sensing. Its unique characteristics make display integration trivial compared with other optical multi-touch solutions. ZeroTouch provides a high-framerate, high-resolution solution for robust optical multi-touch sensing in a thin, transparent form factor.

By comparison, multi-touch technology requires people to use touch commands involving multiple fingers. The iPhone senses changes at each point along the touchscreen's grid. In other words, each point on the grid generates its own signal when it is touched and sends it to the iPhone's processor. This allows the phone to determine the location and movement of simultaneous touches at multiple points on the touchscreen.
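The grid-based sensing described above can be sketched in a few lines of code. This is a simplified illustration, not Apple's implementation: the grid values, threshold, and function name are hypothetical.

```python
# Minimal sketch of grid-based multi-touch detection: each grid point
# produces its own signal, and the processor reports every point whose
# signal indicates a touch. Threshold and grid values are illustrative.

def detect_touches(grid, threshold=0.5):
    """Return (row, col) coordinates of every grid point whose
    signal exceeds the threshold, i.e. all simultaneous touches."""
    return [
        (r, c)
        for r, row in enumerate(grid)
        for c, signal in enumerate(row)
        if signal > threshold
    ]

# Two fingers touching a 3x3 sensor grid at the same time:
grid = [
    [0.0, 0.9, 0.0],
    [0.0, 0.0, 0.0],
    [0.7, 0.0, 0.0],
]
print(detect_touches(grid))  # [(0, 1), (2, 0)]
```

Because every point reports independently, simultaneous touches are resolved without ambiguity, which is what enables multi-finger gestures.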

In comparison with the popular capacitive sensors of Apple's iPhone and iPad, interaction via ZeroTouch requires no pushing by the hand and fingers, greatly reducing muscle fatigue. It also enables new forms of free-air interaction, with more precision than, for example, Microsoft's Kinect.

In contrast, a conventional touchscreen is an electronic visual display that can detect the presence and location of a touch within the display area. The term generally refers to touching the display of a device with a finger or hand, though touchscreens can also detect contact by passive objects such as a stylus. Touchscreen technology is commonly used in devices such as tablet computers and smartphones. However, some touchscreens use capacitive material that responds only to touch by a hand or finger.

At the 2011 Conference on Human Factors in Computing Systems, the premier venue for cutting-edge HCI research and technologies, Texas A&M's Interface Ecology Lab presented ZeroTouch in an interactive exhibit.

Dr. Andruid Kerne of the department of computer science and engineering said the group showcased the unique capabilities of this new natural user interface sensing modality with three applications:

  • intangibleCanvas uses the ZeroTouch sensor as a precision free-air interactive input modality, allowing users to reach through the sensor and paint on a projected screen. The embodied interaction enables painting with the elbows, arms, and head as well as the fingers. intangibleCanvas affords control over brush style, color, and ink flow through a multimodal iPhone interface held in the nondominant hand.
  • Hand + Pen in Hand Command is a multi-touch and stylus-enabled real-time strategy game. The combination of pen and touch allows completely new ways of interacting with the game, enabling the user to directly manipulate the map and control and direct units with a level of precision not found in traditional RTS interaction. The dominant hand fluidly switches between stylus and direct multi-touch interaction, while the nondominant hand uses multi-touch to activate command modes. It is built on the open-source Zero-K game engine.
  • ArtPiles is a new curatorial tool for museums and art galleries that gives curators new ways to manipulate large collections of art works when designing exhibits, and historians new ways to organize the collections. Each art work is represented by an image enhanced with descriptive metadata, derived with the Interface Ecology Lab's open-source meta-metadata language and architecture. ArtPiles' combination of pen and multi-touch interaction enables new visual and semantic manipulations of the art collection that are not possible with the pen or touch modalities alone. This research integrates the fields of information semantics, information visualization, and interaction design.

ZeroTouch technology uses hundreds of modulated infrared sensors and several infrared LEDs to create a series of invisible light beams crossing the screen. The interruption of the beams means something has touched the screen, and researchers can visualize the interrupted beams to reconstruct the visual outline of any objects inside the sensor frame.
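The beam-interruption principle can be sketched with a toy model. This is a simplification for illustration only: it assumes one horizontal and one vertical beam per grid line, whereas the real ZeroTouch sensor uses hundreds of modulated crossing beams to recover finer outlines.

```python
# Toy sketch of occlusion sensing: an object inside the frame blocks
# some vertical and some horizontal beams, and intersecting the blocked
# beams estimates where the object sits. Names and layout are hypothetical.

def locate_occlusions(blocked_cols, blocked_rows):
    """Intersect blocked vertical beams (columns) with blocked
    horizontal beams (rows) to estimate object positions."""
    return [(x, y) for x in sorted(blocked_cols) for y in sorted(blocked_rows)]

# A fingertip at column 2, row 5 interrupts one beam in each direction:
print(locate_occlusions({2}, {5}))  # [(2, 5)]
```

Note that with only two beam directions, two simultaneous objects produce "ghost" intersections; using many beam angles, as the actual sensor does, is what lets the full outline of each object be reconstructed.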