Interface Ecology Lab unveils contactless ZeroTouch technology

16 May 2011

Texas A&M University's Interface Ecology Lab exhibited ZeroTouch, a unique optical multi-finger sensing technology, at the ACM CHI 2011 Conference in Vancouver, which ended on 12 May.

ZeroTouch allows for zero-force, zero-thickness, completely transparent multi-touch sensing. These characteristics make display integration far simpler than with other optical multi-touch solutions, while delivering high-framerate, high-resolution, robust optical multi-touch sensing in a thin, transparent form factor.

By contrast, multi-touch technology relies on touch commands performed with multiple fingers. The iPhone, for example, senses changes at each point along the touchscreen's grid: each point on the grid generates its own signal when it is touched and sends it to the phone's processor. This allows the phone to determine the location and movement of simultaneous touches at multiple points on the touchscreen.
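
To make the grid-sensing idea above concrete, the following is a minimal sketch, not Apple's actual firmware, of how a processor might turn per-node readings from a capacitive grid into a list of simultaneous touch positions and their movement between frames. The threshold value, function names, and sample data are illustrative assumptions.

    # Illustrative sketch of grid-based multi-touch detection (assumed values, not a real driver).
    from typing import List, Tuple

    TOUCH_THRESHOLD = 0.5  # assumed normalized signal change that counts as a touch

    def find_touches(grid: List[List[float]]) -> List[Tuple[int, int]]:
        """Return (row, col) of every grid node whose signal exceeds the threshold."""
        touches = []
        for r, row in enumerate(grid):
            for c, signal in enumerate(row):
                if signal > TOUCH_THRESHOLD:
                    touches.append((r, c))
        return touches

    def track_movement(prev: List[Tuple[int, int]], curr: List[Tuple[int, int]]):
        """Pair each current touch with its nearest previous touch to estimate movement."""
        moves = []
        for (r, c) in curr:
            if prev:
                pr, pc = min(prev, key=lambda p: (p[0] - r) ** 2 + (p[1] - c) ** 2)
                moves.append(((pr, pc), (r, c)))
        return moves

    # Example: two simultaneous touches on a 4x4 grid
    frame = [
        [0.0, 0.1, 0.0, 0.0],
        [0.0, 0.9, 0.0, 0.0],   # touch near (1, 1)
        [0.0, 0.0, 0.0, 0.8],   # touch near (2, 3)
        [0.0, 0.0, 0.1, 0.0],
    ]
    print(find_touches(frame))  # [(1, 1), (2, 3)]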

In comparison with the popular capacitive sensors of Apple's iPhone and iPad, interaction via ZeroTouch requires no pressure from the hand and fingers, greatly reducing muscle fatigue. It also enables new forms of free-air interaction, with more precision, for example, than Microsoft's Kinect.

A conventional touchscreen, by contrast, is an electronic visual display capable of detecting the presence and location of a touch within the display area. The term generally refers to touching the display with a finger or hand, though touchscreens can also detect contact by passive objects such as a stylus. Touchscreen technology is common in devices such as tablet computers and smartphones; however, some touchscreens use capacitive material that responds only to touch by a bare hand.

At the 2011 Conference on Human Factors in Computing Systems, the premier venue for cutting-edge HCI research and technologies, Texas A&M's Interface Ecology Lab presented ZeroTouch in an interactive exhibit.

Dr. Andruid Kerne of the Department of Computer Science and Engineering said the group is showcasing the unique capabilities of this new natural user interface sensing modality with three applications:

  • intangibleCanvas uses the ZeroTouch sensor as a precision free-air interactive input modality, allowing users to reach through the sensor and paint on a projected screen. The embodied interaction enables painting with the elbows, the arms, and the head as well as the fingers. intangibleCanvas affords control over brush style, color, and ink flow through a multimodal iPhone interface held in the nondominant hand.
  • Hand + Pen in Hand Command, a multi-touch and stylus-enabled real-time strategy (RTS) game. The combination of pen and touch allows for completely new ways of interacting with the game, enabling the user to directly manipulate the map and control and direct units with a level of precision not found in traditional RTS interaction. The dominant hand fluidly switches between stylus and direct multi-touch interaction, while the nondominant hand uses multi-touch to activate command modes. It is built on the open-source Zero-K game engine.
  • ArtPiles, a new curatorial tool for museums and art galleries that gives curators new ways to manipulate large collections of art works when designing exhibits, and historians new ways to organize those collections. Each art work is represented by an image enhanced with descriptive metadata, derived with the Interface Ecology Lab's open-source meta-metadata language and architecture. ArtPiles' combination of pen and multi-touch interaction enables new visual and semantic manipulations of the art collection that are not possible with the pen or touch modalities alone. This research integrates the fields of information semantics, information visualization, and interaction design.

ZeroTouch technology uses hundreds of modulated infrared sensors and several infrared LEDs to create a series of invisible light beams crossing the screen. When a beam is interrupted, something has touched the screen; by visualizing which beams are interrupted, researchers can reconstruct the visual outline of any objects inside the sensor frame.
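
As a rough illustration of that beam-interruption principle, and not the lab's actual implementation, the sketch below models beams as line segments across the frame and accumulates evidence of occlusion in every grid cell crossed by a blocked beam, a simple backprojection-style reconstruction. The frame geometry, grid resolution, and beam layout are assumptions made for the example.

    # Illustrative sketch of occlusion reconstruction from interrupted light beams (assumed geometry).
    import numpy as np

    GRID = 32      # reconstruction resolution (assumed)
    FRAME = 1.0    # sensor frame modeled as a unit square

    def beam_cells(p0, p1, steps=64):
        """Grid cells that a straight beam from p0 to p1 passes through."""
        ts = np.linspace(0.0, 1.0, steps)
        xs = p0[0] + ts * (p1[0] - p0[0])
        ys = p0[1] + ts * (p1[1] - p0[1])
        cols = np.clip((xs / FRAME * GRID).astype(int), 0, GRID - 1)
        rows = np.clip((ys / FRAME * GRID).astype(int), 0, GRID - 1)
        return set(zip(rows, cols))

    def reconstruct(beams, blocked):
        """Accumulate evidence of occlusion in every cell crossed by a blocked beam."""
        image = np.zeros((GRID, GRID))
        for (p0, p1), is_blocked in zip(beams, blocked):
            if is_blocked:
                for r, c in beam_cells(p0, p1):
                    image[r, c] += 1.0
        return image

    # Example: LEDs on the left edge, sensors on the right edge, beams near mid-height blocked
    beams = [((0.0, y), (1.0, y)) for y in np.linspace(0.05, 0.95, 10)]
    blocked = [abs(y - 0.5) < 0.05 for _, (_, y) in beams]
    outline = reconstruct(beams, blocked)
    print(np.argwhere(outline > 0)[:3])  # cells where an object likely interrupted the beams

In a real sensor the beams fan out in many directions around the frame, so intersecting the blocked beams localizes objects in both axes rather than only along one edge, as in this simplified example.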
