Sensor Fusion for Autonomous Vehicles: Interview with Tactile Mobility CEO Amit Nisenbaum

January 17, 2019 by Mark Hughes

An interview with Amit Nisenbaum, CEO of Tactile Mobility, a company that uses software-based sensor fusion to make tactile sensor data useful for autonomous vehicles.

We often talk about the vision systems for autonomous vehicles, but what about the sensor systems that gather data where the rubber meets the road? Tactile Mobility CEO Amit Nisenbaum discusses the sensor fusion that goes into processing data from tactile sensors in self-driving cars.

I met Amit Nisenbaum, CEO of Tactile Mobility, at CES 2019 last week at the Venetian Hotel. Tactile Mobility is based in Haifa, Israel and has offices in Los Altos, California. Tactile Mobility “enables smart and autonomous vehicles with a sense of touch.”

But Tactile Mobility does not deal in sensors directly, instead focusing on the process of sensor fusion for very specific sensor types.

Here are some insights from Nisenbaum on the relationship between autonomous vehicles, data, and the future of the automotive industry.

Tactile Sensors vs. Vision Sensors

Most of the time when we think of autonomous vehicle sensors, we first think of camera or LiDAR systems, sensors that provide a visual map of the environment around the vehicle.

But this is far from the only kind of sensor needed to make autonomous vehicles run.

The Tesla Model X. Tesla Autopilot's informational page focuses only on camera and vision systems, but that's not the whole story. Image courtesy of Tesla.

“Computers [inside autonomous vehicles] need visual and tactile sensors," says Nisenbaum. "Many companies are focusing on the visual side; we augment those capabilities with tactility.”

Tactility refers to the ability to sense the physical environment: the sense of touch. Tactile sensors, then, are those that measure the physical world, including pressure sensors, force sensors, vibration sensors, and others. These sensors may be capacitive, piezoelectric, piezoresistive, or elastoresistive in nature, but all measure some aspect of physical objects interacting.

But Tactile Mobility doesn't develop tactile sensors, either. Instead, Nisenbaum says, they focus on sensor fusion via specialized software and proprietary algorithms.

“We are not a sensors company; we do not require additional sensors. We have software that we embed in or on one of the vehicle’s computers. That software collects data from multiple existing non-visual sensors, such as wheel speed sensors, wheel angle sensors, RPM, accelerometers. We fuse that data to create a master signal that represents, in real time, the dynamic between the vehicle and the road. We process the signal to clean it and then apply a proprietary algorithm to derive actionable insights.”
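
To make that pipeline concrete, here is a minimal sketch of the idea, assuming evenly sampled signal arrays. The function name, channel weights, and smoothing step are illustrative stand-ins, not Tactile Mobility's proprietary method:

```python
import numpy as np

def fuse_chassis_signals(wheel_speed, steering_angle, rpm, accel, window=5):
    """Fuse several non-visual sensor channels into one "master" signal.

    Each argument is a 1-D NumPy array sampled at the same rate.
    All names and weights here are hypothetical.
    """
    channels = [wheel_speed, steering_angle, rpm, accel]

    # Normalize each channel to zero mean / unit variance so that no
    # single sensor dominates the fused signal.
    normed = [(c - c.mean()) / (c.std() + 1e-9) for c in channels]

    # Fixed illustrative weights; a production system would tune or
    # learn these per vehicle.
    weights = [0.4, 0.2, 0.1, 0.3]
    master = sum(w * c for w, c in zip(weights, normed))

    # "Clean" the fused signal with a simple moving average before any
    # downstream algorithm derives insights from it.
    kernel = np.ones(window) / window
    return np.convolve(master, kernel, mode="same")
```

A real implementation would also handle differing sample rates, sensor dropouts, and calibration, but the shape of the computation (normalize, weight, fuse, clean) is the same.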

So where does this data go?

Embedded vs. Cloud Software

Nisenbaum says that the company operates "behind the scenes," working with the computers that actually consume the data Tactile Mobility generates.

“It’s important to emphasize that our software stack is comprised of embedded software and cloud software, and those two components can act independently. For instance, an OEM can engage with us in order to embed our software in their vehicles and do nothing with the cloud." 

This model is becoming a familiar one: a developer creates a product that can either supply raw data for a designer to use in-house or process that data for them. Even hardware developers increasingly offer data-processing environments to support designers rather than forcing them to develop their own processing algorithms.

Vehicle DNA and Surface DNA: Gathering Data on Vehicles and Roads

For Tactile Mobility, this cloud portion of the system deals specifically with automotive applications, down to where the rubber literally meets the road.

"In the cloud, we take data from the vehicles and we break that signal into two mathematical models that describe the two elements that created the tactility." By this, he means the elements of the vehicle or sensor and the environment. "We know that the grip level is ‘x’, but we don’t know how much of it came from the vehicle and what was the contribution of the road."

Image courtesy of Tactile Mobility.

The solution to this conundrum? Gather data on the roads.

"In the cloud, we take the more than 14 million kilometers (of road) that we have been collecting and analyzing continuously and we create models (that number grows by about 500,000 kilometers a month). We create two mathematical models: one describes the unique vehicle down to the VIN—we call it Vehicle DNA. The second model describes the road and segments in the road—we call that Surface DNA. Those names signify that the models are specific to the road and to the vehicle."

Unsurprisingly, developing these data-processing models requires a lot of data in its own right: "For Vehicle DNA, we take multiple drives of the same vehicle over the same road. Each drive will have a different signal, but there will be some commonality. When you lay them one on top of another, there will be [a] common signal, and perhaps some oscillations or differences of the road on top of that. We know how to ‘net out’, how to clean those additional parts of the signal. What we are left with is data about the vehicle. We can then tell things such as tire health, engine efficiency, brake-pad health, etc."
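
Here is a rough sketch of that "netting out" step, assuming the repeated drives have already been aligned to the same road segment and length; the median stands in for whatever robust estimator the real system uses:

```python
import numpy as np

def net_out_vehicle_signal(drives):
    """Recover the vehicle-specific component from repeated drives.

    `drives` is a list of equal-length 1-D arrays: the fused signal
    from the same vehicle over the same road. The per-sample median
    keeps what the drives have in common (the vehicle) and suppresses
    one-off oscillations and road noise.
    """
    stacked = np.vstack(drives)
    common = np.median(stacked, axis=0)   # proxy for "Vehicle DNA"
    residuals = stacked - common          # per-drive road/noise terms
    return common, residuals
```

Trends in the recovered common signal over weeks of driving would be the kind of evidence that points to tire wear or brake-pad degradation.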

While this information is incredibly important for keeping vehicles functional, it's also useful in other ways.

Looking to the Future: Data Types and Applications

“It creates a new category of data. That data is applicable in many use cases for several customers. OEMs get real-time data. We also take Vehicle DNA and Surface DNA from the cloud and we download it back to the vehicle. Over time, tires wear out, engines become less efficient. For each vehicle, we download the Surface DNA of the surroundings. We have normalized grip level (only due to the road) and we have an anticipated grip level for this specific vehicle on the road ahead.”
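
As a back-of-the-envelope illustration of how those two models might combine (the multiplicative form and the numbers are assumptions, not the company's actual formula):

```python
def anticipated_grip(surface_grip, vehicle_factor):
    """Combine a road segment's normalized grip with a vehicle factor.

    surface_grip   -- normalized grip of the road ahead, 0..1
                      (the "Surface DNA" contribution)
    vehicle_factor -- multiplier for the vehicle's current state,
                      e.g. tire wear (the "Vehicle DNA" contribution)
    The multiplicative combination is an assumption for illustration.
    """
    return max(0.0, min(1.0, surface_grip * vehicle_factor))

# Example: worn tires (factor 0.8) on a wet segment (normalized grip
# 0.6) anticipate a grip level of about 0.48 on the road ahead.
print(anticipated_grip(0.6, 0.8))
```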

Based on that information, vehicles may be designed differently in the future. “OEMs tell us, ‘We would like to know how people drive our vehicles in reality and how our vehicles are behaving in reality.’ This is for project strategy and improvement.”

On a larger scale, the data may also change the way that we design and maintain our infrastructure. According to Nisenbaum, “Road authorities and municipalities are able to do better-planned maintenance and real-time obstacle detection."

In fact, Nisenbaum says that this is already in progress: “We are already working with Haifa and several other municipalities I cannot name in several countries around the world.”

As cars become increasingly automated, we need to provide their computers with as much information as possible to deliver their passengers and cargo safely from one location to another. 

Tactile Mobility’s software is intended to help vehicle manufacturers determine safe driving speeds and conditions on particular roads, as well as prescribe preventative maintenance. For municipalities, the data can advise them of road segments in need of repair.

Every day we move one step closer to that future, and that future depends on data.