With U.S. traffic fatalities still high, vulnerable road users (VRUs) are a concern for autonomous vehicle developers. Owl Autonomous Imaging Inc. yesterday announced the availability of a whitepaper, “New Regulations for Cars to Protect Pedestrians at Night.” The whitepaper examines the status of the latest international, automotive industry, and insurance regulations that will affect how automakers design and build vehicles.
“If you are involved in the development of next-generation autonomous vehicle safety and ADAS [advanced driver-assist systems], it is essential for you to understand how the industry will be changing over the next decade,” stated Chuck Gershman, co-founder and CEO of Owl. “Safety, especially night-time driving safety, is a critical milestone for the automotive industry’s next-generation vehicles.”
Fairport, N.Y.-based Owl Autonomous Imaging delivers monocular 3D thermal ranging computer vision systems to the automotive and industrial mobility markets. The company claimed that its systems can enhance safety by day or night and in adverse weather conditions. It holds 16 patents.
Whitepaper tracks regulatory efforts
Around the world, government agencies and industry organizations are developing and implementing safety regulations, noted Owl Autonomous Imaging. The new regulations could force vehicle makers to adopt safety and night-time driving technologies that are more effective than those in use today, it said.
This new whitepaper examines the evolving state of various regulations and their potential impact on pedestrian automatic emergency braking (PAEB) systems. Owl said the paper also identifies emerging technologies to improve road safety for pedestrians, bicyclists, and motorcyclists.
Owl AI sees 3D, in the dark
Owl Autonomous Imaging's Thermal Ranger is a passive 3D sensor that uses deep learning and custom thermal sensors to extract dense range maps. The company said its artificial intelligence can identify, classify, and determine the distance to living objects in dense urban environments as well as dark country roads.
Owl AI is intended to allow drivers and autonomous vehicles to safely navigate and stop to avoid catastrophic damage or injury. In February, the company released a whitepaper explaining how Thermal Ranger uses convolutional neural networks (CNNs) to locate and identify the thermal signatures of pedestrians and animals in the dark with a single infrared camera.
“The technology stack includes everything from pre-processing, sensor fusion, localization data, and decision making to the actuator system,” Gershman told Robotics 24/7. “We built the sensor and reference camera system and the perception stack for that sensor-specific interpretation of the data. We can get a true 3D response from 2D images, and we've built a high-definition digital thermal sensor.”
The CNN builds a disparity map, as in stereo vision, in which per-pixel disparity is converted to a depth measurement, he explained. Owl's API can convert data sets for sensor fusion, but the company is not building full perception systems or bounding-box pipelines, which take longer to develop, said Gershman.
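The disparity-to-depth relationship Gershman refers to is the standard stereo-geometry formula. A minimal sketch in Python follows; the focal length, baseline, and disparity values are illustrative assumptions, not Owl's parameters, and in the monocular case the disparity map would come from the CNN's dense prediction rather than a second physical camera:

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Convert a per-pixel disparity map to metric depth.

    Uses the stereo relation depth = focal_length * baseline / disparity.
    Pixels with zero (or negative) disparity are treated as invalid and
    mapped to infinite depth.
    """
    disparity_px = np.asarray(disparity_px, dtype=np.float64)
    depth = np.full_like(disparity_px, np.inf)
    valid = disparity_px > 0
    depth[valid] = focal_px * baseline_m / disparity_px[valid]
    return depth

# Illustrative numbers only: 1,000 px focal length, 0.3 m virtual baseline.
disp = np.array([[30.0, 15.0],
                 [0.0, 60.0]])
print(disparity_to_depth(disp, focal_px=1000.0, baseline_m=0.3))
# 30 px disparity -> 10 m, 15 px -> 20 m, 60 px -> 5 m
```

The inverse relationship is why small disparity errors matter most for distant objects: halving the disparity doubles the estimated range.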
Sensor fusion for safety
Cost, resolution, and ease of integration are barriers, but Owl's thermal sensor is 200 times cheaper than existing thermal cameras, and its computer vision algorithms make it easy to integrate, Gershman said. He claimed that sensor fusion provides richer data than relying on lidar alone.
“For example, finding deer works better with our algorithms and sensor fusion between color and thermal imaging,” he said. “Also, RGB plus thermal helps spot pedestrians in chaotic urban environments with lots of lights, such as Las Vegas.”
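One way the RGB-plus-thermal idea can be pictured is late fusion, where per-object confidence scores from each modality are combined after detection. This is purely an illustrative sketch, not Owl's pipeline; the weights and threshold are assumptions:

```python
def fuse_detections(rgb_conf, thermal_conf, w_rgb=0.4, w_thermal=0.6):
    """Late fusion: combine per-object confidences from two modalities.

    A pedestrian washed out by bright city lights may score low in the
    RGB stream but high in thermal; a weighted sum lets either modality
    carry the detection past the decision threshold.
    """
    return w_rgb * rgb_conf + w_thermal * thermal_conf

# A deer at night: nearly invisible to the color camera, obvious in thermal.
fused = fuse_detections(rgb_conf=0.15, thermal_conf=0.90)
print(fused >= 0.5)  # True: the fused score crosses the 0.5 threshold
```

Weighting thermal more heavily here reflects the night-driving scenarios the article emphasizes; a production system would tune such weights per condition.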
Owl AI's goal is to bring the unit cost down to $150 for Tier 1 automotive OEMs, according to Gershman.
“The cheapest thermal camera today has a sensor board, an ISP [image signal processor] for image correction, and a third board for the interface,” he said. “We turn a three-board system into a one-board system.”
“Cameras are still cheapest, with radar next, and 4D [imaging radar] is more expensive,” Gershman noted. “Then there's thermal and finally lidar, which starts at $5,000, depending on the distance. Lidar is not a unilateral solution.”
From space to the street
“Two of our cameras are in outer space—one for NASA, and one for Google Earth,” Gershman said. “Seven years ago, we entered into a proof-of-concept contract with the U.S. Air Force for a challenge. We're the only one who built a camera that met their needs.”
“The Air Force used thermal imaging and computer vision at resolutions beyond what was commercially available,” he recalled. “They can be applied to pure autonomy, but in the near term, we're focusing on ADAS for our go-to-market perspective.”
Now, Owl Autonomous Imaging is in discussions with nearly 80 companies, including makers of trucks, ADAS, and self-driving vehicles. Are there regional differences in demand for vehicle imaging systems?
“The German market understands lidar limitations and the value proposition of seeing deer at night, so it is pushing hardest to test and get down the learning curve, followed by the U.S.,” replied Gershman. “We're also talking with one sensor player in South Korea and have just started to do road shows in Japan.”
Other sectors interested in Owl's technology include agriculture and transportation.
“Combine makers want them to run 24/7, but fields can be really dusty. Cameras and lidar don't work well in dust, and radar works but doesn't know where it is,” said Gershman. “Automotive wants military-grade performance and better-than-military reliability at no cost.”
“Some robotaxi companies have already deployed test vehicles with Owl's HD thermal system,” he added.