New Robot Can Touch and Feel Like Humans Do


Robots capable of grasping and tactile sensing are generally bulky and rigid, because those functions are typically implemented with motors. A group from Cornell University has created a way for a soft robot to feel its environment internally, much the same way humans do.

Robert Shepherd, an assistant professor of mechanical and aerospace engineering and principal investigator of the Organic Robotics Lab, led the group, which published a paper describing how stretchable optical waveguides can act as force, elongation, and curvature sensors in a soft robotic hand.

The article, titled “Optoelectronically Innervated Soft Prosthetic Hand via Stretchable Optical Waveguides,” appeared in the debut issue of Science Robotics and lists doctoral student Huichan Zhao as lead author.

Zhao explained that while most robots sense their surroundings with sensors mounted on the outside of the body, this hand’s sensors are integrated within the body itself: the soft robot detects forces transmitted through the thickness of its ‘skin’. Zhao compares this to what humans and all other organisms do when, for example, we feel pain.

(A) Schematic of the hand structure and its components; (B) the fabricated hand mounted on a robotic arm, with each finger actuated at ΔP = 100 kPa. (Image Credit: Cornell University)

Optical waveguides have been used for acoustic, position, and tactile sensing since the early 1970s. In the early days, manufacturing them was a complicated process, but this has changed over the last 20 years with the introduction of 3-D printing and soft lithography. As a result, elastomeric sensors can now be developed and produced, and these can easily be incorporated into a soft robotic application.

The group used a four-step soft lithography process to produce the core, through which light propagates, and the cladding, which forms the outer surface of the waveguide and houses the LED (light-emitting diode) and the photodiode. The more the prosthetic hand deforms, the more light is lost through the core. This variable loss of light is detected by the photodiode, allowing the prosthesis to “sense” its surroundings.
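To make the sensing principle concrete, here is a minimal sketch of how a photodiode reading might be turned into a deformation estimate. Everything in it is an illustrative assumption rather than the paper's actual calibration: the baseline voltage, the linear loss-per-degree constant, and the function names are all invented for this example.

```python
import math

# Assumed photodiode voltage when the finger is relaxed (illustrative value).
V_BASELINE = 2.0
# Assumed linear calibration: decibels of optical loss per degree of bend.
DB_PER_DEGREE = 0.05

def attenuation_db(v_measured, v_baseline=V_BASELINE):
    """Optical power loss relative to the relaxed state, in decibels.

    As the finger bends, more light escapes the waveguide core, so the
    photodiode voltage drops and the computed attenuation rises.
    """
    return 10.0 * math.log10(v_baseline / v_measured)

def estimated_bend_degrees(v_measured):
    """Invert the assumed linear loss-vs-curvature calibration."""
    return attenuation_db(v_measured) / DB_PER_DEGREE

# A reading of 1.0 V (half the baseline) corresponds to
# 10 * log10(2) ≈ 3.01 dB of loss, i.e. about 60 degrees of bend
# under this made-up calibration.
```

In the real device the loss-versus-deformation relationship would be measured empirically for each waveguide; the linear mapping here only stands in for that calibration curve.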

The group demonstrated the optoelectronic prosthesis performing a variety of tasks, including grasping and probing objects for both texture and shape. The hand also scanned three tomatoes and determined which was ripest by probing their relative softness.
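The ripeness test above reduces to a simple comparison: pressed with the same actuation pressure, a softer (riper) tomato lets the finger sink in further, which registers as greater waveguide light loss. The sketch below illustrates that comparison; the loss values are invented placeholders, not data from the paper.

```python
def ripest(probe_loss_db):
    """Return the label whose probe produced the greatest light loss (dB).

    Greater loss means the sensorized finger deformed more against that
    object, i.e. the object was softer and therefore riper.
    """
    return max(probe_loss_db, key=probe_loss_db.get)

# Hypothetical per-tomato readings at identical actuation pressure.
readings = {"tomato_1": 1.2, "tomato_2": 2.9, "tomato_3": 0.8}
print(ripest(readings))  # prints "tomato_2": it deformed the finger most
```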