The Robotic Age
By Julien Happich | January 3, 2017
Reliable sensor implementation can be difficult on flexible extremities such as finger-like grippers. This is what a team of researchers from Cornell University (Ithaca, NY) set out to address with a novel type of soft, waveguide-based light sensor that can be readily embedded into deformable grippers.
Their paper "Optoelectronically innervated soft prosthetic hand via stretchable optical waveguides", published in Science Robotics, details the fabrication and operation of chemically inert, stretchable, and flexible optical waveguides made up of an optically transparent core (2 dB/cm propagation loss at 860 nm) clad in a light-blocking elastomer. With an LED fitted at one end and a photodiode at the other, these elastomeric optical waveguides can be monitored for any deformation (stretching, bending, compression) that affects light propagation.
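To put the quoted 2 dB/cm figure in perspective, the fraction of light reaching the photodiode can be estimated with a back-of-the-envelope conversion from decibel loss to linear power (a sketch; only the loss value comes from the paper, the waveguide length is illustrative):

```python
def transmitted_fraction(length_cm, loss_db_per_cm=2.0):
    """Fraction of optical power remaining after propagating length_cm
    through a waveguide with the given propagation loss (dB/cm)."""
    attenuation_db = loss_db_per_cm * length_cm
    return 10 ** (-attenuation_db / 10)

# A 5 cm waveguide at 2 dB/cm attenuates 10 dB, i.e. 90% of the light is lost.
print(transmitted_fraction(5.0))  # → 0.1
```

Any deformation adds further loss on top of this baseline, which is what the photodiode reading actually tracks.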
Fabricated using cheap custom molds obtained through 3D printing, the elastomeric optical waveguides unveiled in the paper had a square cross-section of 3 mm per side, with a 1 mm-wide inner core. Several of them could be accommodated within the fingers of a pneumatically actuated soft prosthetic hand, allowing the researchers to test their sensing capabilities in a real application context.
Whereas proprioceptive sensing in traditional mechanical hands is typically performed through motor encoders combined with bulky, rigid multiaxial force/torque load cells, here a single type of continuous flexible sensor effectively innervated the soft prosthetic hand.
In their soft robotic hand, each finger featured three waveguides bent into a U-shape to detect axial strain throughout the finger. One of the waveguides, fitted with a stiff plate along the neutral bending plane (where there is no axial strain), also served as a touch sensor at the fingertip.
By design, because the surface roughness of the original waveguide mold depends on the 3D printing resolution, the waveguide's optical transmission properties are anisotropic: the "top" of the waveguide core interface is atomically smooth, while the "bottom" core interface has an average roughness of 6 nm from demolding. This anisotropy means that the output signal depends on the direction of bending (up or down). Such propagation anisotropy could also be engineered deliberately to detect side-to-side bending.
The researchers were able to use these waveguide-based optoelectronic sensors to detect curvature, elongation, and force applied to the separate elastomeric silicone fingers.
Analyzing the optical data (light losses upon waveguide deformation), they showed that such a soft prosthetic hand could distinguish curvatures as small as 5 m⁻¹ and surface roughness on the order of 0.1 mm. As a demonstration, by dragging a finger across simple objects such as a computer mouse in a scanning motion, they were able to reconstruct the mouse's shape, including its scroll wheel and click button, purely from the optical data.
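In practice, turning a photodiode reading into a curvature figure requires a calibration step: loss is recorded at known bend radii, and new readings are inverted against that monotonic curve. A minimal sketch of such an inversion (the calibration numbers below are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical calibration table: optical power loss (dB) measured
# while holding the waveguide at known curvatures (1/m).
cal_curvature = np.array([0.0, 5.0, 10.0, 20.0, 40.0])   # curvature, 1/m
cal_loss_db   = np.array([0.0, 0.3, 0.8, 2.1, 5.5])      # excess loss, dB

def curvature_from_loss(loss_db):
    """Invert the monotonic loss-vs-curvature calibration
    by piecewise-linear interpolation."""
    return np.interp(loss_db, cal_loss_db, cal_curvature)

print(curvature_from_loss(0.8))  # → 10.0 (1/m)
```

Sweeping such readings along a scanning motion is what lets a surface profile, like the mouse's shape, be reconstructed point by point.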
Beyond shape and texture detection, the hand was also shown to detect the softness of various test objects, based on a strain and stress analysis of the three waveguides.
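One plausible way to reduce such a press test to a softness figure is to fit the slope of contact force versus indentation depth, with force inferred from the touch waveguide and indentation from the actuator motion. This is an assumed simplification, not the paper's method, and all numbers are illustrative:

```python
import numpy as np

# Hypothetical press-test samples: indentation depth driven by the
# pneumatic actuator (mm) and contact force inferred from the touch
# waveguide's power loss (N). Values are illustrative only.
indentation_mm = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
force_n        = np.array([0.0, 0.4, 0.8, 1.2, 1.6])

# Effective stiffness = slope of the force/indentation line (N/mm);
# softer objects yield a smaller slope.
stiffness, _ = np.polyfit(indentation_mm, force_n, 1)
print(round(stiffness, 3))  # → 0.8 N/mm
```

Comparing these slopes across test objects gives a relative softness ranking without needing any dedicated force sensor.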
The paper concludes that although the soft prosthetic hand was only a research prototype, it highlighted the versatility of the soft optical waveguides implemented as sensors. What's more, because the waveguide sensors and the body of the actuator share the same material library (silicones, elastomers, etc.), more sensors could be incorporated into the actuators, or even replace the body of the actuators, for higher sensor density. Sensitivity could also be increased by using a larger power range from the LED (from the baseline power down to ambient light levels) and by increasing the pressure range of the soft actuators so they press on objects with more force.
The researchers noted that although the sensors were built into different places within the finger actuator, they still observed signal coupling. Incorporating more sensors to extract denser information would make the output signals increasingly coupled. But because the outputs of the waveguide sensors are precise and repeatable, they anticipate that machine-learning techniques could be used to map inputs to outputs, or to perform more subtle object recognition through the collection of large quantities of data.