Ultrafast camera for self-driving vehicles and drones

16 Feb 2017

Scientists from Nanyang Technological University, Singapore (NTU Singapore) have developed an ultrafast high-contrast camera that could help self-driving cars and drones see better in extreme road conditions and in bad weather.

Unlike typical optical cameras, which can be blinded by bright light and are unable to make out details in the dark, NTU's new smart camera can record the slightest movements and objects in real time.

The new camera records changes in light intensity between scenes at nanosecond intervals, much faster than conventional video, and stores the images in a data format that is many times smaller.

With a unique in-built circuit, the camera can do an instant analysis of the captured scenes, highlighting important objects and details.

Developed by Assistant Professor Chen Shoushun from NTU's School of Electrical and Electronic Engineering, the new camera, named Celex, is now in its final prototype phase.

"Our new camera can be a great safety tool for autonomous vehicles, since it can see very far ahead like optical cameras but without the time lag needed to analyse and process the video feed," explained Asst Prof Chen.

"With its continuous tracking feature and instant analysis of a scene, it complements existing optical and laser cameras and can help self-driving vehicles and drones avoid unexpected collisions that usually happens within seconds."

Chen unveiled the prototype of Celex last month at the 2017 IS&T International Symposium on Electronic Imaging (EI 2017) held in the United States.

It received positive feedback from conference attendees, many of whom were academics and top industry players.

How it works
A typical camera sensor has several million pixels, which are sensor sites that record light information and are used to form the resulting picture.

High-speed video cameras that record up to 120 frames, or photos, per second generate gigabytes of video data, which a computer must then process for self-driving vehicles to "see" and analyse their environment.
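To put that data volume in perspective, a rough back-of-the-envelope estimate in Python illustrates why frame-based video adds up so quickly. The sensor resolution and bit depth below are illustrative assumptions, not figures from the article; only the 120 frames per second comes from the text above.

```python
# Rough estimate of the raw data rate of a conventional frame-based camera.
# Resolution and bit depth are assumed for illustration; only the 120 fps
# figure is taken from the article.
width, height = 1920, 1080        # assumed full-HD sensor
bytes_per_pixel = 1               # assumed 8-bit greyscale
fps = 120                         # frame rate quoted above

bytes_per_second = width * height * bytes_per_pixel * fps
print(f"Raw data rate: {bytes_per_second / 1e9:.2f} GB per second")
# About 0.25 GB every second, so gigabytes accumulate within seconds of driving.
```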

The more complex the environment, the slower the processing of the video data, leading to lag times between "seeing" the environment and the corresponding actions that the self-driving vehicle has to take.

To enable instant processing of visual data, NTU's patent-pending camera records only the changes in light intensity at individual pixels on its sensor, which greatly reduces the data output. This avoids the need to capture the whole scene as a photograph, thus increasing the camera's processing speed.
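A minimal software sketch of that per-pixel change detection is shown below. The event format, threshold and frame sizes are assumptions made for illustration and are not taken from the Celex design; a real event sensor performs the comparison in dedicated circuitry at each pixel rather than in software.

```python
import numpy as np

def frame_to_events(prev_frame, curr_frame, timestamp, threshold=15):
    """Emit (x, y, timestamp, polarity) events only where a pixel's
    brightness changed by more than `threshold`, instead of storing
    every pixel of every frame. All parameters here are illustrative."""
    diff = curr_frame.astype(np.int16) - prev_frame.astype(np.int16)
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    polarities = np.sign(diff[ys, xs])          # +1 brighter, -1 darker
    return [(int(x), int(y), timestamp, int(p))
            for x, y, p in zip(xs, ys, polarities)]

# Example: a mostly static 240x180 scene in which only a small patch changes.
rng = np.random.default_rng(0)
prev = rng.integers(0, 200, size=(180, 240), dtype=np.uint8)
curr = prev.copy()
curr[50:60, 100:110] += 40                      # simulate a moving object
events = frame_to_events(prev, curr, timestamp=0)
print(f"{len(events)} events instead of {curr.size} pixels for a full frame")
```

Because only the changed pixels produce output, a largely static scene generates a tiny fraction of the data a full frame would.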

The camera sensor also has a built-in processor that analyses the flow of data instantly to distinguish foreground objects from the background, a process known as optical flow computation. This innovation gives self-driving vehicles more time to react to oncoming vehicles or obstacles.
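As a rough frame-based analogue of that step, the sketch below uses OpenCV's dense Farneback optical flow and treats pixels with significant motion as foreground. It is only an illustration of the general idea with off-the-shelf tools; the thresholds and synthetic frames are assumptions, and the article does not describe the Celex sensor's built-in processor at this level of detail.

```python
import cv2
import numpy as np

def segment_moving_foreground(prev_gray, curr_gray, motion_threshold=1.0):
    """Return a boolean mask marking pixels whose optical-flow magnitude
    exceeds `motion_threshold`, i.e. likely moving foreground."""
    # Farneback parameters: pyramid scale, levels, window size, iterations,
    # polynomial neighbourhood, polynomial sigma, flags (standard defaults).
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)    # per-pixel motion in pixels
    return magnitude > motion_threshold

# Example with synthetic frames: a bright square that shifts a few pixels.
prev = np.zeros((180, 240), dtype=np.uint8)
curr = np.zeros_like(prev)
prev[60:90, 80:110] = 255
curr[60:90, 85:115] = 255                       # same square, moved right
mask = segment_moving_foreground(prev, curr)
print(f"Foreground pixels detected: {int(mask.sum())}")
```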

Research into the sensor technology started in 2009 and has received $500,000 in funding from a Ministry of Education Tier 1 research grant and a Singapore-MIT Alliance for Research and Technology (SMART) Proof-of-Concept grant.

The technology has also been described in two academic journals published by the Institute of Electrical and Electronics Engineers (IEEE), the world's largest technical professional organisation for the advancement of technology.

Commercialisation potential
With keen interest from the industry, Chen and his researchers have spun off a start-up company named Hillhouse Tech to commercialise the new camera technology. The start-up is incubated by NTUitive, NTU's innovation and enterprise company.

Asst Prof Chen expects the new camera to be commercially ready by the end of this year, as the team is already in talks with global electronics manufacturers.