Fog is dangerous, sometimes deadly. The National Weather Service estimates that 440 people die in airplane crashes every year while flying in limited visibility. Another 600 deaths and 38,700 injuries occur each year in car accidents caused by driving in fog, according to the Federal Highway Administration. But the Department of Energy’s Sandia National Laboratories is working on a solution: sensor systems that could see through fog to guide vehicles and their passengers to safety.

Cutting through the Fog

Sandia researchers, partnering with a team from NASA, have spent the last three years testing new sensors made by the sensor- and imaging-technology company Teledyne FLIR. The researchers lay the devices out in a 180-foot-by-10-foot chamber, then fill it with fog from machines that spray a mix of water and salt. The resulting clouds of moisture grow so dense that a person standing in the chamber can’t see the walls or the objects directly in front of them.

In these blinding clouds, the sensors capture light as it travels through the fog and gather reams of data on precisely how the fog particles obstruct or scatter it. The sensors then relay this information to computers in another room, where researchers process it to generate images and pinpoint the locations of objects in the chamber.
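The underlying physics can be sketched with the Beer-Lambert law: light passing through a scattering medium such as fog decays exponentially with path length. The Python snippet below is a minimal illustration of that relationship, not the labs’ actual processing code; the extinction coefficient and the helper names are assumptions made for the example:

```python
import math

def transmitted_fraction(extinction_coeff: float, path_length_m: float) -> float:
    """Beer-Lambert law: fraction of light that survives a path through fog.

    extinction_coeff is the per-meter attenuation, which depends on droplet
    size and density (an assumed value here, not a measured one).
    """
    return math.exp(-extinction_coeff * path_length_m)

def estimate_range(intensity_ratio: float, extinction_coeff: float) -> float:
    """Invert the attenuation model: recover the distance light traveled,
    given the measured ratio of received to emitted intensity."""
    return -math.log(intensity_ratio) / extinction_coeff

# Illustrative dense-fog attenuation of ~0.1 per meter (assumed, not measured).
sigma = 0.1
for d in (10, 30, 55):  # distances in meters, up to the ~55 m chamber length
    f = transmitted_fraction(sigma, d)
    print(f"{d:>3} m: {f:.3%} of light survives; "
          f"recovered range = {estimate_range(f, sigma):.1f} m")
```

In practice the inversion is far harder, since scattered light arrives from many directions at once, but the exponential falloff is the quantity the chamber experiments let researchers measure under controlled, repeatable conditions.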

Such data, the researchers hope, will help them create imaging systems that improve a self-driving plane’s or car’s “sensing and situational awareness capabilities,” said Brian Bentz, the project lead. He looks forward to vehicles using the sensors to detect, and avoid crashing into, any fog-enshrouded objects ahead of them.

Game Changer for Military Missions

Aerial drones that can navigate through fog during reconnaissance missions for the military or other government agencies are one high priority for the researchers. They envision the aircraft relying on these fog-adapted sensors to “see” in conditions that would leave human pilots flying blind.

“We want to make sure these vehicles are able to operate safely in our airspace,” said Nick Cramer, the project’s lead NASA engineer. “This technology will replace a pilot’s eyes, and we need to be able to do that in all types of weather.”

In one recent test, the researchers had an aerial drone hover in place in the foggy room, then checked how well sensors laid out at various locations throughout the chamber could perceive it through the fog. Cramer said the researchers are also assessing how each sensor’s distance from an object affects its ability to see it.

Aerial drones that use these sensors will likely share the airspace with other drones, or even fly in group formations with them, Cramer noted. He said that fine-tuning the sensors’ ability to detect other objects in the air at varying distances will be essential to keep the drones from colliding, and for that he considers the Sandia fog chamber an excellent testing ground.

“The fog chamber at Sandia National Laboratories is incredibly important for this test,” Cramer said. “It allows us to really tune in the parameters and look at variations over long distances. We can replicate long distances and various types of fog that are relevant to the aerospace environment.”
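Why distance matters becomes clear with a little arithmetic: in an active sensor the emitted light crosses the fog twice and also spreads geometrically, so the returned signal collapses rapidly as the fog thickens. The sketch below continues the toy model above, with assumed extinction coefficients and an assumed noise floor (none of these are measured Sandia or Teledyne FLIR figures), to estimate how far such a sensor could see:

```python
import math

def max_detection_range(noise_floor: float, extinction_coeff: float,
                        step_m: float = 0.5) -> float:
    """Walk outward until the returned signal drops below the noise floor.

    The signal model combines round-trip fog attenuation with inverse-square
    geometric spreading; all quantities are relative to emitted intensity.
    """
    d = step_m
    while True:
        # exp(-2*sigma*d) for the round trip, times 1/d^2 for spreading.
        signal = math.exp(-2 * extinction_coeff * d) / (d * d)
        if signal < noise_floor:
            return d - step_m
        d += step_m

for sigma in (0.02, 0.05, 0.1):  # light, moderate, dense fog (assumed values)
    print(f"extinction {sigma}/m -> detects out to "
          f"~{max_detection_range(1e-6, sigma):.1f} m")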

Making the Roads and Skies Safer for Humans, Too

This technology won’t just protect flying robots, though; it could ultimately save human lives. Aerial drones don’t just fly military missions, after all. They also run reconnaissance after earthquakes and other natural disasters, searching for survivors amid the rubble. A drone that can see better when visibility is low will let emergency responders rescue more injured people more quickly.

And this technology will sooner or later make its way into self-driving cars. Human drivers who realize that they cannot see in front of them could switch into self-driving mode and let the car take over. Its sensors will peer through the haze and detect any oncoming vehicles or objects that the human eye cannot.

“It’s important to improve optical sensors to better perceive and identify objects through fog to protect human life, prevent property damage, and enable new technologies and capabilities,” said optical engineer Jeremy Wright.

A Valuable Public-Private Partnership

Teledyne FLIR has been in the imaging-technology game for quite some time and has an array of camera and sensor tools to show for it. But its researchers credit Sandia and its fog facility with making their research into fog-vision technology much easier. The company, for example, used the same fog chamber to test its new infrared cameras’ ability to see through fog.

“Fog testing is very difficult to do in nature because it is so fleeting and there are many inherent differences typically seen in water droplet sizes, consistency and repeatability of fog or mist,” said Chris Posch, director of automotive engineering for Teledyne FLIR. “As the Sandia fog facility can repeatedly create fog with various water content and size, the facility was critical in gathering the test data (for the infrared cameras) in a thorough scientific manner.”



Rick Docksai is a Department of Defense writer-editor who covers defense, public policy, and science and technology news. He earned a master’s degree in journalism from the University of Maryland in 2007.