Goyal team’s passive 3D imaging method can map distances, determine object materials, and more

By Patrick L. Kennedy

When you look out your window at night, you expect to see objects—a tree, a neighbor’s house—illuminated by street lamps or moonlight. If there were a power outage on a moonless night, you’d see only darkness.

That doesn’t mean there’s no light out there, though. “There is light,” says Professor Vivek Goyal (ECE). “It’s just at wavelengths that you can’t see with the naked eye.”

With the aid of an ordinary thermal camera or night vision goggles, you could see something—at least the outlines of nearby objects. But Goyal says a much richer sense of the surroundings can be gleaned from that invisible-to-us light, and his team is developing the more sophisticated data processing needed to do it. Someday, their 3D imaging technology—which they recently reported in an IEEE journal—might be used for mapping and navigation for autonomous vehicles, among other applications.

[Photo: Vivek Goyal (ECE) in his office]

“You can infer distance,” says Goyal. “The atmosphere is not only absorbing light but also emitting light, as a function of wavelength, and we can mathematically model that. There’s different absorption at different wavelengths as light travels through the air, so light that’s traveled a longer distance has a different spectrum than light that was emitted very close to you.”
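The principle Goyal describes can be sketched mathematically: along a path of length d, air both attenuates the object's light (Beer-Lambert absorption) and adds its own thermal emission, and the balance between the two differs by wavelength. The toy model below is only an illustration of that idea, not the team's published algorithm; the absorption coefficients and radiance values are invented for the example.

```python
import numpy as np

# Hypothetical per-band atmospheric absorption coefficients (1/m) and
# radiances; real values would come from an atmospheric model.
alpha = np.array([0.002, 0.010, 0.030])   # absorption in three wavelength bands
L_obj = np.array([5.0, 4.0, 3.0])         # radiance leaving the object
L_atm = np.array([2.0, 2.0, 2.0])         # radiance emitted by the air itself

def observed(d):
    """Simple radiative-transfer model: object light attenuated over
    distance d, plus emission from the intervening air."""
    t = np.exp(-alpha * d)                # transmittance of the air path
    return L_obj * t + L_atm * (1 - t)

def estimate_distance(measurement, d_grid=np.linspace(0, 500, 5001)):
    """Grid search for the distance whose predicted spectrum best
    matches the measured one (least squares over the bands)."""
    errs = [np.sum((observed(d) - measurement) ** 2) for d in d_grid]
    return d_grid[int(np.argmin(errs))]

true_d = 120.0
est = estimate_distance(observed(true_d))  # recovers ~120 m in this toy setup
```

Because each band attenuates at a different rate, the measured spectrum changes shape with distance, which is what makes the search well-posed.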

Goyal and colleagues have begun successfully picking up distance cues by passively measuring thermal radiation at these various wavelengths, which are too long for the naked eye to see. Their sensor technology is passive in the sense that it detects light but doesn't emit any.

“The work was initially funded by DARPA to support autonomous navigation in the dark while remaining stealthy,” explains Goyal, who continued building upon the work for more general applications after completing the defense project. “You can put lidar [a laser imaging, detection, and ranging system] on an autonomous vehicle, but lidar is not stealthy—it’s emitting light all the time.”

Traditional thermal imaging, Goyal says, is almost “too easy.” “A lot of the prior work was related to the Air Force, where they studied tracking a missile or an airplane—something much hotter than the atmosphere. We want to be able to use this absorption principle to do ranging [determining distances] for scenes where the objects are not necessarily hotter than the air at all—in fact, the objects could be colder than the air.”

In addition to mapping distances, Goyal’s team believes their passive 3D sensing methods might also determine the materials of objects, air temperature, humidity, and gas concentrations—all of which could aid in navigation.

“We separate out the effects of material and temperature,” Goyal says. “So if an autonomous vehicle is navigating at night, and an obstacle is just about the same temperature as the road, it would look the same to an ordinary thermal camera, whereas our sensor would discern the difference and be able to navigate around it.”
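The separation Goyal mentions rests on a standard physical fact: the radiance a thermal sensor measures depends on both an object's temperature and its material (its emissivity). Under a simple gray-body assumption, the ratio of radiances in two wavelength bands depends only on temperature, so temperature and emissivity can be untangled. The sketch below illustrates that textbook idea with Planck's law; the band choices and target values are made up for the example, and this is not the team's actual processing pipeline.

```python
import numpy as np

# Physical constants: Planck, speed of light, Boltzmann (SI units)
h, c, k = 6.626e-34, 2.998e8, 1.381e-23

def planck(lam, T):
    """Blackbody spectral radiance at wavelength lam (m), temperature T (K)."""
    return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

# Two thermal-infrared bands (hypothetical choices, in meters)
lam1, lam2 = 8e-6, 12e-6

def separate(m1, m2, T_grid=np.linspace(250, 350, 10001)):
    """Gray-body model m_i = eps * planck(lam_i, T): the band ratio m1/m2
    depends only on T, so fit T from the ratio, then recover emissivity."""
    ratio = m1 / m2
    model = planck(lam1, T_grid) / planck(lam2, T_grid)
    T = T_grid[int(np.argmin(np.abs(model - ratio)))]
    eps = m1 / planck(lam1, T)
    return eps, T

# Simulate an asphalt-like surface: emissivity 0.95 at 290 K
eps_true, T_true = 0.95, 290.0
m1 = eps_true * planck(lam1, T_true)
m2 = eps_true * planck(lam2, T_true)
eps_est, T_est = separate(m1, m2)
```

Two objects at the same temperature but made of different materials give different emissivities under this model, which is the distinction an ordinary single-band thermal camera misses.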

The students and postdocs in Goyal’s lab hail from disciplines including computer science, materials science, electrical engineering, and computer engineering, and his colleagues include researchers at MIT, the National Institute of Standards and Technology, and the Jet Propulsion Laboratory.

“Research is so social,” says Goyal. “A lot of it has to do with connecting with people with the same interests.”