News

Meet WARLORD: Metawave Aims to Bring Millimeter-Wave RADAR Sensors to the Automotive Industry

August 29, 2018 by Mark Hughes

We've heard all about LiDAR sensors for automotive applications. But what about RADAR? Metawave has developed a RADAR sensor, dubbed WARLORD, that CEO Dr. Maha Achour believes will eventually allow safer Level 4 and Level 5 autonomous vehicles.

LiDAR has been a quickly rising star in the sensing arena. RADAR sensors, however, may stand to give it a run for its money.

AAC's Mark Hughes spoke to Metawave’s Founder and CEO, Dr. Maha Achour, and Metawave's VP of Strategic Alliances, Tim Curley, to look at how millimeter-wave RADAR sensors may unseat LiDAR as the future of automotive sensing.

The Current State of LiDAR

The first autonomous cars were built with mechanical LiDAR units affixed atop their roofs. These systems generate large datasets called point clouds that are then passed to a computer for processing (3D SLAM). Inside the computer, advanced algorithms try to determine which objects are cars, people, trees, buildings, signs, etc. By watching the objects move over time, the central computer can determine velocity and bearing, and predict collisions.

Image showing mapping surrounding autonomous vehicle. Image used courtesy of Velodyne LiDAR

While these first LiDAR units were adequate for the early days of Level 1 and Level 2 autonomous driving, they were aesthetically obtrusive atop their parent vehicles, and they committed the unforgivable sin of being prohibitively expensive. For LiDAR to enter the mass market, the per-unit price had to drop substantially—and the easiest way to realize that dream is to remove the rotating array and eliminate any macroscopic moving parts.

How Does WARLORD Work?

WARLORD has a custom antenna, created with custom materials, controlled by custom integrated circuits.

Active antenna created from adaptive metamaterials. Image used courtesy of Metawave

The signal from a single feedpoint is controlled with the custom ICs to provide an electrically steerable beam pattern.

Beam-pattern from an active RADAR antenna. Image used courtesy of Analog.

This active antenna configuration allows WARLORD to change its beam pattern at will to create one or many lobes. This allows the system to simultaneously track multiple objects or to focus on particular objects of interest. Narrow beams allow for tracking objects with a smaller RADAR cross-section at a greater distance.
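The steering described above can be illustrated with the classic phased-array model: applying a progressive phase shift across the elements moves the main lobe without any mechanical motion. This is a simplified sketch (a uniform linear array of isotropic elements at half-wavelength spacing), not Metawave's actual metamaterial antenna design.

```python
import numpy as np

def array_factor(theta_deg, steer_deg, n_elements=16, spacing_wl=0.5):
    """Magnitude of the array factor at observation angles theta_deg
    when per-element phase shifts steer the main lobe toward steer_deg."""
    theta = np.radians(np.asarray(theta_deg))
    steer = np.radians(steer_deg)
    k = 2 * np.pi  # wavenumber, with distances expressed in wavelengths
    n = np.arange(n_elements)
    # Progressive phase shift applied by the phase shifters / control ICs
    phase = -k * spacing_wl * n * np.sin(steer)
    # Coherently sum the element contributions at each observation angle
    af = np.abs(np.sum(
        np.exp(1j * (k * spacing_wl * np.outer(np.sin(theta), n) + phase)),
        axis=1)) / n_elements
    return af

angles = np.linspace(-90, 90, 721)
af = array_factor(angles, steer_deg=30)
print(f"Main lobe near {angles[np.argmax(af)]:.1f} degrees")  # close to 30
```

Re-running with a different `steer_deg` moves the peak instantly, which is the "electrically steerable beam" the article refers to; more elements narrow the lobe, which is why narrow beams favor small-cross-section targets at range.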

Challenges for RADAR in the Autonomous Vehicle Industry

It is impossible to predict the future while the technology is still in the early stages of development. But mechanical rotary LiDAR units appear to have been rejected by OEMs en masse. LeddarTech, Innoviz, and Velodyne have solid-state LiDAR units that are currently being integrated into Level 3 autonomous vehicles. The cost of these units will continue to decrease and their performance will improve. However, all modern LiDAR and camera units suffer from the same critical issues—they have limited range and can be obstructed by debris.

By that same token, however, Dr. Achour says that there's only one real challenge remaining for millimeter-wave technology when it comes to hardware limitations:

"When you start doing beam-forming instead of just sending the signal everywhere, you put this digital wave on every single antenna or analog phase shifter. Now you're operating this array as a phase array antenna. The problem with this approach occurs if it's not being designed in concert together, if they are designed independently. As soon as [the matching antennas] start steering the beam, it goes way above 10 db. You have reflection coming from the antenna, and that reflection basically kills your PA, kills your IC, creates this thermal noise. So, these are talking about the limitations, not talking about the signal processing and the delay and the power consumption in doing this expensive digital signal processing."

Another challenge for autonomous vehicles as a whole is the concept of responsibility for the "decisions" that a car makes. Dr. Achour pointed out that multiple RADAR sensors may be "overkill for Level 3 cars" such as a Tesla because there is a driver who is responsible for the safety of car operation. "But when you go to Level 4 and 5," she says, "Well now the safety is the responsibility of the company that operates this fleet of cars. The profit is not just per car sold but is basically per mile driven. So it's a very different business model for both the car OEM and the service provider."

Metawave, however, does not claim to use its AI to make decisions at a vehicle level.

"Artificial intelligence covers a very broad functionality inside the car. So, if this is centralized, that means we have only one central processor that takes raw data and processes the whole thing. I think that the trend is going to be in doing what we call a hybrid or hybrid centralized and decentralized AI algorithm, so AI processing. Now you have each sensor provide some sort of labeling of these objects to the sensor fusion, and the sensor fusion does another layer of AI to decide 'Should I stay on this lane? Should I brake? Should I change lanes? What should I do?' We [at Metawave] don't do the sensor fusion and there are a lot of companies that do. In addition, all the car OEMs also want to own that sensor fusion because, in the end, this is the brain of the car and the company that has the smartest and safest brain is going to be the winner. We don't expect all of the players to survive a level four or level five challenge. Very few."

So if Metawave's AI isn't intended to perform sensor fusion and produce "decisions" to direct a vehicle's actions, what does the AI do?

"What we offer is an AI algorithm that sits only in the RADAR and only is responsible for processing the radar data and provide it with some level of confidence about the object. For example, if I see a truck maybe with 90% probability, I can provide that label to the sensor fusion, let's say at 300 meters. If I see a motorcycle at 300 meters because the cross-sections are smaller, I will provide it maybe with a 50% accuracy. Now, the sensor fusion will take this information and will instruct the LiDAR and the camera to look in the direction of the motorcycle instead of looking everywhere and wasting time just to verify is this really a motorcycle or not. By doing that, we provide the sensor fusion enough time to react before the car hits the motorcycle, and at the same the RADAR doesn't become liable of the final decision because we provide the long-range information."

Metawave also says it offers something unique to allow for better decision-making. "We give [OEMs, etc.] the option to have raw data. Today, none of the RADAR companies provide raw data. They only provide the 2D point cloud, which is the range and the Doppler, just because it's a Level 2, Level 3 [application]. But if we provide them with the raw data, they can do whatever they want with it (and we provide them with the post-processed data of course on a different business model). Then, they have a much stronger platform to work with to make sure that the operation of the car is seamlessly maintained in any kind of operating condition, in any type of weather condition, and at the highest safety expectation."

What’s Next? Ambitions to Unseat LiDAR

Current ADAS (advanced driver-assistance systems) require cameras, LiDAR, and other sensor systems—all of which will almost certainly be necessary for Level 4 and Level 5 vehicles. But, Dr. Achour says, this may change in 10 to 15 years, once sensor fusion has further evolved. With sufficiently advanced RADAR sensors ("with high-resolution imaging capability that is capable of operating in all weather conditions and all environments and also adding the non-line-of-sight detection and tracking, doing the V2V communication"), you may be able to avoid the need for short- and mid-range sensors altogether.

"You add more functionality to the RADAR," she says. "You may not need these short-range and mid-range RADAR sensors. So you are eliminating other sensors."

Metawave is still refining its millimeter-wave RADAR technology, as are other companies that have yet to make their presence known in the market. In a few years, when the RADAR-based-technology companies are ready for tier-1 integration, they might very well supplant the solid-state LiDAR that is all the rage today.