News

The Tiny Radar Chip Revolutionizing Gesture Recognition: Google ATAP’s Project Soli

June 15, 2016 by Kate Smith

Google ATAP is bringing touchless interfaces to your smartwatch, and maybe everywhere else, too. This is Project Soli.

Using a miniaturized radar chip no bigger than a dime, Google ATAP is bringing touchless interfaces to the market.

Google ATAP at Google I/O 2016

Project Soli made a big splash at last month’s Google I/O 2016 conference, even more so than at its initial unveiling at Google I/O the year before. Soli is being developed by Google’s Advanced Technology and Projects group, or Google ATAP, which tends to focus on mobile hardware technology.

Google ATAP differs from its Alphabet sibling, X, in that it incubates projects for only two years, whereas X develops its projects over longer timelines, such as its six-year work on self-driving cars. This means that every project Google ATAP undertakes is rocketed through development, and even production, far faster than under almost any other incubation process in the world.

The project founder, Ivan Poupyrev, introduced both Project Soli and Project Jacquard (which deals with wearable smart fabrics) at this year's gathering.

In his presentation, machine learning engineer Nick Gillian showed off Soli’s capabilities in the context of smartwatch control. With Soli embedded in a smartwatch form factor, he demonstrated how tiny gestures, and the distance between hand and watch, can provide a touchless interface for checking texts and the weather.

The team also demonstrated the gesture-control technology in a speaker form factor, reusing the same algorithms as the smartwatch demo.

The Soli Sensor

Soli’s radar sensor is a marvel in many respects. For one thing, it solves a long-standing problem in gesture-recognition technology. Previous forays into the field yielded only partial answers, such as stereo cameras (which have difficulty resolving overlapping fingers) and capacitive touch sensing (which struggles to interpret motion in three dimensions).

Google ATAP’s answer is radar.

Radar is capable of interpreting objects’ position and motion even through other objects, making it perfect for developing a sensor that can be embedded in different kinds of devices like smartphones.

The difficulty was that radar hardware was far too large for wearable applications. Way too large. Even the scaled-down early prototypes ATAP developed were about the size of a briefcase.

 

Hardware prototypes from the past two years of development. Image courtesy of Google ATAP.

 

However, after several iterations, the current model is only 8mm x 10mm: smaller than a dime. And that’s including the antenna array.

 

The radar chip held by Ivan Poupyrev. Image courtesy of Google I/O 2016.

 

By the by, this radar tech went from the size of a briefcase to the size of a dime in a span of ten months.

The Soli embedded system, developed in partnership with Infineon, is also a large stride forward: for comparison, interpreting conventional radar data often requires the use of a supercomputer.

 

Image courtesy of Google I/O 2016.

 

The Soli evaluation board itself has two transmit antennas and four receive antennas, and it is designed to be easy to develop with as Soli moves toward its possible applications.

 

Image courtesy of Google I/O 2016.

Radar Sensors and Algorithmic Interpretation

To recognize gestures, the device emits a broad radar beam that is reflected back by the hand in front of it. The sensor receives that motion as a series of reflections, and the signal changes as the hand moves through space, reflecting back different patterns.

Motion signals detected by the radar chip are transformed into multiple representations, including a range-Doppler map, which locates the hand by its velocity and its distance from the sensor.
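
To make that concrete, here’s a minimal sketch of how a range-Doppler map is commonly computed, assuming a chirped radar frame. This is textbook radar processing with made-up dimensions, not ATAP’s actual code:

    import numpy as np

    def range_doppler_map(frame):
        # `frame` is assumed to hold raw radar samples shaped (num_chirps, samples_per_chirp):
        # rows are successive chirps ("slow time"), columns are samples within one chirp ("fast time").
        range_profile = np.fft.fft(frame, axis=1)  # fast-time FFT resolves distance (range)
        rd = np.fft.fftshift(np.fft.fft(range_profile, axis=0), axes=0)  # slow-time FFT resolves velocity (Doppler)
        return 20 * np.log10(np.abs(rd) + 1e-12)  # magnitude in dB is the range-Doppler image

    # Example with a fake frame of 32 chirps x 64 samples per chirp:
    print(range_doppler_map(np.random.randn(32, 64)).shape)  # (32, 64): velocity bins x range bins

As the hand moves, each gesture traces its own characteristic pattern of energy through successive maps like this.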

From these representations, engineers can extract features, such as "range-Doppler distance". The features are then passed to machine learning algorithms, which interpret them and approximate hand motions from the received signals.

The SDK pipeline outputs this information at anywhere from 100 to 10,000 frames per second.
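
Putting the pieces together, the flow looks roughly like the sketch below: featurize each range-Doppler frame, then hand the features to a trained classifier. The feature definitions, gesture labels, and the scikit-learn model here are stand-ins for illustration, not the algorithms ATAP actually uses:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def extract_features(rd_map):
        # Hypothetical hand-crafted features summarizing one range-Doppler frame.
        mag = np.abs(rd_map)
        total_energy = mag.sum()
        # Centroids describe where the reflected energy sits in range and in velocity.
        range_centroid = (mag.sum(axis=0) * np.arange(mag.shape[1])).sum() / total_energy
        doppler_centroid = (mag.sum(axis=1) * np.arange(mag.shape[0])).sum() / total_energy
        return np.array([total_energy, range_centroid, doppler_centroid])

    # Train a generic classifier on labeled frames (placeholder data and labels).
    X_train = np.stack([extract_features(np.random.rand(32, 64)) for _ in range(200)])
    y_train = np.random.choice(["pinch", "slide", "dial"], size=200)
    clf = RandomForestClassifier(n_estimators=50).fit(X_train, y_train)

    # At runtime, each incoming frame is featurized and classified into a gesture label.
    print(clf.predict([extract_features(np.random.rand(32, 64))]))

In practice a recognizer would look at sequences of frames rather than single frames, since a gesture is a motion over time.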

Minute Gesture Control

All of this leads up to a device that's capable of categorizing and interpreting gestures.

In ATAP’s promotional videos for Project Soli, the tech is shown recognizing gestures in real time. A pinching motion can indicate the press of a button.

Rubbing a finger along the length of a thumb can indicate moving a slider, say for volume control.

Pulling your thumb across your pointer knuckle simulates dragging a screen.

And rubbing the tip of your finger and thumb together can indicate turning a dial.
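
Downstream, an application maps whatever gesture label the recognizer emits onto an interface action. Here is a minimal sketch of that last step, with purely hypothetical label and action names:

    # Hypothetical mapping from recognized gesture labels to UI actions;
    # both the labels and the actions are illustrative, not from ATAP.
    GESTURE_ACTIONS = {
        "pinch": "press button",
        "finger_slide": "move volume slider",
        "knuckle_drag": "scroll screen",
        "dial_rub": "turn dial",
    }

    def handle_gesture(label):
        # Dispatch a recognized gesture to its UI action (just printed here).
        print(f"gesture={label!r} -> action={GESTURE_ACTIONS.get(label, 'ignore')!r}")

    for gesture in ["pinch", "dial_rub", "unrecognized_wiggle"]:
        handle_gesture(gesture)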

Endless Applications: Results of the Alpha Release

There are many, many applications for this technology, from civil engineering (sensing cars at a stoplight without needing pressure plates) to communications (including more precise sign language interpretation) to medicine (finer control for surgical robotics).

For now, though, ATAP has already held an alpha release. In October 2015, it gave about 60 developers in 14 countries access to a Soli development kit, complete with hardware and software.

 

Image courtesy of Google I/O 2016.

 

Here are a couple of highlights from the results:

  • In the UK, the University of St. Andrews harnessed Soli for object recognition, using it to accurately predict what's placed on the sensor, such as copper vs. steel and milk vs. water.
  • Also in the UK, Goldsmiths, University of London used Soli to develop a visualization interface (one that looks excitingly similar to Tony Stark's gesture-based visualizations in Iron Man).
  • In Spain, B-Reel used Soli to create a gesture-based security system, where a custom gesture serves as a password.
  • And Japan's Semitransparent Design used Soli for predictive drawing.

As you can see, many doors could open once Soli is released to developers more broadly.

Poupyrev says he's looking forward to seeing what the development community can do with Soli: "I really want them to be excited and motivated to do something cool with it."

For the beta release, ATAP also intends to release an API with the representation, feature, and gesture-label data they've built.

All gif material used courtesy of Google ATAP.