
Tactile Sensing Gets a Boost for AI Systems Thanks to Open-source Research

November 03, 2021 by Jake Hertz

AI sensing continues to push to new levels as research pours in on tactile sensing. One major contributor is Facebook, now rebranding as "Meta," with its DIGIT and ReSkin technologies.

While humans have five senses, today's AI systems are primarily built on only two: sound and sight. Of the missing senses, touch is arguably receiving the most attention, since applications like robotics and VR could benefit significantly from tactile sensing.

Often a pioneer in the field, Facebook (now rebranding as "Meta") has recently announced a series of innovations in tactile sensing for AI systems. This article will discuss its breakthroughs, DIGIT and, most recently, ReSkin, to see how Meta hopes to bring a third sense to the world of AI.


DIGIT Pushes Forward AI Tactile Sensing

Before getting into ReSkin, it is worth noting that Meta has a history of research innovation in tactile sensing. Most notably, the company released DIGIT, a low-cost, high-resolution tactile sensor, back in January 2020. 

DIGIT sensor expanded view. Image used courtesy of Facebook


DIGIT is a vision-based tactile sensor consisting of a gel elastomer, shaped like the tip of a finger, housed inside a square plastic enclosure. Along with the elastomer, DIGIT contains a camera and a series of LEDs.

When DIGIT touches an object, the elastomer deforms and changes the colors in the camera's image, and these changes can then be interpreted as tactile sensing information.
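To make this concrete, here is a minimal sketch of how a vision-based tactile signal can be extracted from such an image, assuming OpenCV and NumPy. The frame-differencing approach, function name, and threshold below are illustrative assumptions, not Meta's published DIGIT pipeline.

```python
import cv2
import numpy as np

def detect_contact(reference_bgr, frame_bgr, threshold=25):
    """Illustrative contact detection: compare the live gel image
    against a no-contact reference frame (hypothetical pipeline)."""
    # Per-pixel color difference between the deformed and resting gel.
    diff = cv2.absdiff(frame_bgr, reference_bgr)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)

    # Pixels whose color shifted strongly are treated as the contact patch.
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)

    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return mask, None  # no contact detected
    return mask, (int(xs.mean()), int(ys.mean()))  # contact centroid
```

In practice, a learned model would replace the simple threshold, but the principle is the same: gel deformation shows up as a color change that can be localized in the image.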

At the time of its release, DIGIT introduced several significant improvements over previous tactile sensors, including a smaller form factor, improved manufacturability, and enhanced mechanical reliability.

This week, Meta announced that DIGIT will now be commercially manufactured in partnership with MIT spin-off GelSight.

ReSkin: A Cost-effective Sensing Solution

Building on the foundation of DIGIT, Meta also announced ReSkin this week, a new open-source tactile sensing "skin" solution.

Developed in collaboration with researchers at Carnegie Mellon University, ReSkin is a deformable elastomer designed with embedded magnetic particles. 

According to the publication, when this material deforms, the deformation changes its magnetic field in a way that nearby magnetometers can detect. The system then uses AI-based techniques to translate this data into relevant information such as contact location and applied force.
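As a rough illustration of this kind of pipeline, the sketch below trains a small regressor to map magnetometer readings to a contact position and force. The five-magnetometer layout, the scikit-learn model, and the random stand-in data are assumptions made for illustration; they are not ReSkin's released code or base models.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Stand-in training data: five 3-axis magnetometer readings in,
# contact location (x, y) and normal force out. Real data would
# come from labeled indentations of the elastomer skin.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(5000, 15))   # 5 magnetometers x 3 axes
y_train = rng.normal(size=(5000, 3))    # (x_mm, y_mm, force_N)

# A small neural network learns the field-to-contact mapping.
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300)
model.fit(X_train, y_train)

# At run time, each new magnetometer sample becomes a prediction.
sample = rng.normal(size=(1, 15))
x_mm, y_mm, force_n = model.predict(sample)[0]
print(f"contact at ({x_mm:.1f}, {y_mm:.1f}) mm, force {force_n:.2f} N")
```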

Representation of ReSkin reading changes in its internal magnetic field to perform tactile sensing. Image used courtesy of Facebook/Meta


Altogether, ReSkin comes out to 2-3 mm thick, has a lifetime of over 50,000 interactions, offers a temporal resolution of up to 400 Hz, and achieves a spatial resolution of 1 mm with 90% accuracy.

Example showing how ReSkin provides significantly improved tactile sensing while picking up a blueberry. Image used courtesy of Facebook/Meta

ReSkin offers several unique value propositions. First off, unlike other skin-based sensors, ReSkin is generalizable thanks to a novel self-supervised learning algorithm that allows the sensors to auto-calibrate without manual intervention. 

On top of this, since ReSkin reads changing magnetic fields, it removes the need for direct contact with the sensing electronics and instead works based on proximity.

Finally, the system is inexpensive, costing less than $6 each at quantities of 100 units, making it easily accessible and deployable in most systems.


Paving the Way With Open-source Technology

Moving forward, Meta plans to release ReSkin's design, relevant documentation, code, and base models to the public. The goal is to allow AI researchers to use ReSkin in their own research without needing to collect or train their own data sets, with the ultimate hope that tactile sensing will be developed and brought to scale in the near future. With open-source technology such as this, the door is open to even better and more advanced systems.