BioSleeve Lets The User Control A Robot Via Gestures (+VIDEO)
NASA's Jet Propulsion Laboratory has created a new gesture-based human interface that uses an array of EMG sensors, IMUs, and magnetometers to decode hand and arm gestures and map them to an intuitive robot control system. The BioSleeve looks like an elastic bandage covering most of your forearm and includes 16 dry-contact sensors that detect the movements of the muscles in your arm, so you can make gestures and have a robot respond to them.

The system is much like existing gesture recognition systems, but the BioSleeve doesn't depend on vision or on keeping your hand in close proximity to a sensor. A great advantage of using EMG is that the signals are correlated with muscle force: if clenching your fist commands a robot to drive forward, clenching it harder makes the robot drive faster. The BioSleeve can differentiate between its full set of gestures with 96.6 percent accuracy.

The sleeve could eventually be embedded into wearable garments and put on as part of everyday clothing, and thanks to its low power consumption per channel, the system can operate for long periods of time on in-sleeve batteries.

The paper, "Gesture-Based Robot Control with Variable Autonomy from the JPL BioSleeve," was presented last week at the IEEE International Conference on Robotics and Automation (ICRA) in Karlsruhe, Germany.
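The proportional-control idea above (clench harder, drive faster) can be sketched in a few lines: the EMG amplitude of a recognized gesture scales the speed command sent to the robot. This is a minimal illustration, not JPL's actual implementation; all function names, thresholds, and the linear mapping are assumptions for the sake of the example.

```python
# Hypothetical sketch of the proportional EMG-to-speed mapping described in
# the article: a "fist clench" gesture drives the robot forward, and the
# clench intensity (EMG signal amplitude) scales the speed.
# All names and threshold values are illustrative assumptions.

def rms_amplitude(samples):
    """Root-mean-square amplitude of one window of EMG samples."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

def drive_speed(emg_window, rest_level=0.05, max_level=1.0, top_speed=2.0):
    """Map EMG amplitude to a forward speed in m/s.

    Below rest_level the hand counts as relaxed (speed 0); between
    rest_level and max_level the speed scales linearly, clamped at top.
    """
    amp = rms_amplitude(emg_window)
    if amp <= rest_level:
        return 0.0
    frac = min((amp - rest_level) / (max_level - rest_level), 1.0)
    return frac * top_speed

# A gentle clench yields a slow speed; a hard clench approaches top speed.
print(drive_speed([0.2, -0.2, 0.2, -0.2]))  # gentle clench
print(drive_speed([0.9, -0.9, 0.9, -0.9]))  # hard clench
```

The key property, as the article notes, is that no camera or line of sight is needed: the control signal comes directly from muscle activity measured at the skin.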