Touch-sensitive electronic skin for robots is under development by a number of groups. Scientists at Cornell University, however, are pursuing a simpler approach, using shadow-imaging cameras to let robots know when they’re being touched. Known as ShadowSense, the experimental system incorporates a USB-powered, laptop-connected camera located beneath a non-electronic translucent “skin” on a soft-bodied robot.
As a person reaches toward the robot, the ambient lighting casts a shadow of their hand onto the skin. The camera tracks that shadow from the other side of the skin (within the robot), using machine-learning algorithms to determine when the hand is actually touching the skin, which area of the skin it’s touching, and what gesture it’s making. In this way, ShadowSense can not only tell when and where the robot is being touched, but also assign different commands to different touch gestures.
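The core idea of tracking a shadow from inside the skin can be illustrated with a much-simplified sketch. The snippet below is hypothetical and not the researchers’ actual pipeline: it replaces their machine-learning classifier with a simple darkness threshold on a synthetic grayscale camera frame, reporting the shadow’s centroid as the touched region (the function name, threshold value, and image data are all illustrative assumptions).

```python
import numpy as np

def detect_shadow(frame, darkness_threshold=60):
    """Simplified stand-in for ShadowSense's approach: treat pixels darker
    than a threshold as shadow, and return the shadow's centroid (row, col)
    as the approximate touch location, or None if no shadow is present."""
    mask = frame < darkness_threshold          # shadow = sufficiently dark pixels
    if not mask.any():
        return None                            # no shadow -> no touch detected
    ys, xs = np.nonzero(mask)                  # coordinates of shadow pixels
    return (float(ys.mean()), float(xs.mean()))

# Synthetic 100x100 "inside the skin" frame: bright skin with a dark patch
frame = np.full((100, 100), 200, dtype=np.uint8)
frame[40:60, 70:90] = 30                       # simulated hand shadow
print(detect_shadow(frame))                    # prints (49.5, 79.5)
```

In the real system, the same camera feed would instead be passed to a trained classifier that distinguishes gestures, not just locations.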
The current prototype robot – which is mainly just an inflatable bladder of nylon skin stretched around a cylindrical wheeled skeleton – is capable of differentiating between touching with a palm, punching, touching with two hands, hugging, pointing and not touching at all, with an accuracy of 87.5 to 96 percent, depending on the strength and direction of the lighting.
The researchers are quick to point out that the applications of the technology aren’t limited to robotics, as it could also be used in touchscreen displays or electronic appliances. That said, ShadowSense does still have some limitations – not only is a light source required, but the camera also has to be within line of sight of the interactive part of the skin. The use of mirrors or additional lenses could help address the latter limitation.
“Touch is such an important mode of communication for most organisms, but it has been virtually absent from human-robot interaction,” says the lead scientist, Assoc. Prof. Guy Hoffman. “One of the reasons is that full-body touch used to require a massive number of sensors and was therefore not practical to implement. This research offers a low-cost alternative.”
The research is described in a recently published paper in the journal Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies.