Purdue 3D sensor sees depth and texture in pitch darkness
By DE Staff
University’s AI-powered HADAR sensor could enable robots and autonomous vehicles in low visibility conditions.
Purdue University researchers announced the development of a 3D machine vision sensor that can perceive texture and depth data in low- or no-light environments. Called HADAR (heat-assisted detection and ranging), the technology combines thermal physics, infrared imaging and machine learning to enable fully passive, physics-aware machine perception, the researchers say.
“Our work builds the information theoretic foundations of thermal perception to show that pitch darkness carries the same amount of information as broad daylight,” said Zubin Jacob, the Elmore Associate Professor of Electrical and Computer Engineering in the Elmore Family School of Electrical and Computer Engineering. “Evolution has made human beings biased toward the daytime. Machine perception of the future will overcome this long-standing dichotomy between day and night.”
Traditional thermal imaging is a fully passive sensing method that collects invisible heat radiation originating from all objects in a scene. Like HADAR, it can sense through darkness, inclement weather and solar glare, but is hindered by fundamental challenges, the researchers said.
“Objects and their environment constantly emit and scatter thermal radiation, leading to textureless images famously known as the ‘ghosting effect,’” said research scientist Fanglin Bao, who co-developed HADAR with Jacob. “This loss of information, texture and features is a roadblock for machine perception using heat radiation.”
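Bao's point can be illustrated with a toy radiative model (an assumption for illustration only, not the published HADAR formulation): a surface's signal mixes its own blackbody emission with environmental radiation it scatters. When the surroundings sit at the same temperature as the object, the two contributions sum to the same blackbody radiance no matter what the emissivity is, so emissivity texture washes out.

```python
import math

H, C, KB = 6.626e-34, 3.0e8, 1.381e-23  # Planck, light speed, Boltzmann

def planck(lam, temp):
    """Blackbody spectral radiance at wavelength lam (m), temperature temp (K)."""
    return (2 * H * C**2 / lam**5) / (math.exp(H * C / (lam * KB * temp)) - 1)

def signal(lam, temp, emissivity, temp_env):
    # Emission from the surface plus environmental radiation scattered off it.
    return emissivity * planck(lam, temp) + (1 - emissivity) * planck(lam, temp_env)

lam, temp = 10e-6, 300.0  # a 10 µm thermal band, scene near room temperature

# Two objects at the same temperature but with very different surfaces:
shiny = signal(lam, temp, emissivity=0.1, temp_env=300.0)
matte = signal(lam, temp, emissivity=0.9, temp_env=300.0)

# Their signals coincide — the emissivity contrast (texture) has vanished.
print(math.isclose(shiny, matte))
```

In this toy model the camera sees a uniform blackbody field, which is one way to picture why raw thermal images look textureless.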
Similarly, LiDAR (light detection and ranging), radar and sonar emit signals and then receive the returns to collect 3D information about a scene. However, the researchers said, these active methods suffer from signal interference as they are scaled up. Video cameras depend on sunlight or other illumination and therefore perform poorly in low light, fog and rain.
“HADAR vividly recovers the texture from the cluttered heat signal and accurately disentangles temperature, emissivity and texture, or TeX, of all objects in a scene,” Bao said. “It sees texture and depth through the darkness as if it were day and also perceives physical attributes beyond RGB, or red, green and blue, visible imaging or conventional thermal sensing.”
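One way to picture the disentanglement Bao describes — again a toy sketch under an assumed emission-plus-scattering mixing model, not the actual HADAR algorithm — is that measuring several infrared bands makes temperature and emissivity jointly recoverable once the environmental term is accounted for. The three bands below are hypothetical sensor channels chosen for illustration.

```python
import numpy as np

H, C, KB = 6.626e-34, 3.0e8, 1.381e-23  # Planck, light speed, Boltzmann

def blackbody(lam, temp):
    """Planck spectral radiance at wavelengths lam (m), temperature temp (K)."""
    return (2 * H * C**2 / lam**5) / (np.exp(H * C / (lam * KB * temp)) - 1)

# Three mid-infrared bands (8, 10, 12 µm) — hypothetical channels.
bands = np.array([8e-6, 10e-6, 12e-6])

def forward(temp, emissivity, env_radiance):
    # Assumed mixing model: own emission plus scattered environment.
    return emissivity * blackbody(bands, temp) + (1 - emissivity) * env_radiance

# Synthetic measurement: object at 300 K, emissivity 0.7, cooler surroundings.
env = blackbody(bands, 290.0)
measured = forward(300.0, 0.7, env)

# Brute-force grid search for the (temperature, emissivity) pair that
# best explains the multiband signal.
temps = np.linspace(250.0, 350.0, 201)   # 0.5 K steps
emis = np.linspace(0.05, 1.0, 96)        # 0.01 steps
best = min(
    ((t, e) for t in temps for e in emis),
    key=lambda p: np.sum((forward(p[0], p[1], env) - measured) ** 2),
)
print(best)  # recovers roughly (300.0, 0.7)
```

With only one band, many (temperature, emissivity) pairs explain the same reading; adding bands pins both down, which is the intuition behind recovering TeX attributes from multicolor infrared data.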
Initially, the researchers foresee HADAR TeX vision being applied in automated vehicles and in robots that interact with humans in complex environments. The technology could later be developed for agriculture, defense, geosciences, health care and wildlife monitoring applications. First, however, they say the size of the hardware and the data collection speed will need to improve.
“The current sensor is large and heavy since HADAR algorithms require many colors of invisible infrared radiation,” Bao said. “To apply it to self-driving cars or robots, we need to bring down the size and price while also making the cameras faster. The current sensor takes around one second to create one image, but for autonomous cars we need around 30 to 60 hertz frame rate, or frames per second.”