Columbia Engineering Develops a Robotic Finger That Can Accurately Sense Touch
Researchers at Columbia Engineering have introduced a robotic finger that can sense and localize touch with sub-millimeter precision, much like its human counterpart.
The researchers’ study, which was published online in IEEE/ASME Transactions on Mechatronics, describes the new approach taken to develop the robotic finger: it uses overlapping signals from light emitters and receivers embedded in a transparent waveguide layer that covers the functional areas of the finger.
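The idea of overlapping emitter-receiver signals can be sketched roughly as follows. This is an illustrative toy, not the authors' implementation: the emitter and receiver counts are assumed, and `read_signal_matrix` is a hypothetical stand-in for the real hardware readout.

```python
# Hypothetical sketch of the sensing principle: with E emitters and R receivers
# embedded in the waveguide, each readout yields an E x R matrix of light
# intensities. Touch deforms the waveguide and changes light transport, so the
# difference from a no-contact baseline becomes the tactile signal vector.
import numpy as np

NUM_EMITTERS = 4    # assumed count, for illustration only
NUM_RECEIVERS = 8   # assumed count, for illustration only

def read_signal_matrix(rng):
    """Stand-in for hardware readout: one intensity per emitter-receiver pair."""
    return rng.random((NUM_EMITTERS, NUM_RECEIVERS))

rng = np.random.default_rng(0)
baseline = read_signal_matrix(rng)     # intensities with no contact
touched = read_signal_matrix(rng)      # intensities during a touch
signal = (touched - baseline).ravel()  # one feature vector of length E * R
```

Because every emitter-receiver pair contributes a measurement, even these small assumed counts yield 32 overlapping signals from only 12 wired components, which is the low-wire-count property the article highlights.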
This approach solves the problems encountered with current methods for building touch sensors for robotic fingers, such as difficulty in covering multi-curved surfaces, a large number of wires, and difficulty fitting everything into small fingertips: all factors that have prevented touch sensors from being integrated into robotic hands.
The optical tactile robot finger developed by Columbia Engineering researchers. Image credit: Pedro Piacenza, Columbia Engineering
Using Signal Data to Localize Touch
The study also demonstrates how rich signal data, which changes in response to deformation caused by touch pressure, can be obtained by measuring light transport between all of the finger’s emitters and receivers.
This enables the finger to localize touch with very high precision over a large, multi-curved surface. The researchers then demonstrated that purely data-driven deep learning methods can extract useful information from the data, including applied force and contact location, without the need for analytical models.
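The data-driven mapping described above can be illustrated with a minimal sketch. This is not the authors' actual model: the network shape, feature count, and outputs (x, y, force) are assumptions, and the weights here are random rather than trained, so it only shows the structure of regressing contact properties directly from the raw signal vector.

```python
# Minimal sketch of a data-driven tactile regressor: a small fully connected
# network maps the flattened emitter-receiver signal vector straight to
# contact location (x, y) and applied force, with no analytical optics model.
import numpy as np

def mlp_forward(signal, w1, b1, w2, b2):
    """Forward pass: signal -> ReLU hidden layer -> (x, y, force) regression."""
    hidden = np.maximum(0.0, signal @ w1 + b1)  # ReLU nonlinearity
    return hidden @ w2 + b2                     # three regression outputs

rng = np.random.default_rng(1)
n_features = 32                                  # E * R pairs, assumed above
w1, b1 = rng.normal(size=(n_features, 16)), np.zeros(16)
w2, b2 = rng.normal(size=(16, 3)), np.zeros(3)   # outputs: x, y, force

signal = rng.normal(size=n_features)             # one sensor reading
x, y, force = mlp_forward(signal, w1, b1, w2, b2)
```

In practice such a network would be trained on labeled contact data (known locations and forces); the point of the sketch is that the mapping is learned end to end rather than derived from a physical model of light transport.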
The end result is a fully integrated, sensitive robotic finger with a low wire count. It is built using simple materials and accessible manufacturing methods and has been designed for easy integration into dextrous robotic hands.
"The human finger provides incredibly rich contact information – more than 400 tiny touch sensors in every square centimeter of skin," said Matei Ciocarlie, who led this work in collaboration with others. "That was the model that pushed us to try and get as much data as possible from our finger. It was critical to be sure all contacts on all sides of the finger were covered – we essentially built a tactile robot finger with no blind spots."