Saturday, October 14, 2017

Using Ultrasound Imaging for Gesture Control with EchoFlex


Most gesture control systems we come across work by either physically measuring the movement of the hand (with sensors in a glove, for example) or by tracking that movement with cameras and computer vision. Both approaches have fairly obvious drawbacks: wearing sensors on your hand is uncomfortable and conspicuous, and computer vision tracking requires a camera mounted somewhere with a clear view of your hand. A newly proposed technology called EchoFlex might make all of that unnecessary.

New research has shown future wearable devices, such as smartwatches, could use ultrasound imaging to sense hand gestures. (📷: Bristol Interaction Group)

EchoFlex, which is mostly just a concept at this point, would use ultrasound imaging to detect the movement of the muscles and tendons in your forearm. That would, in theory, allow gestures to be registered with nothing more than a band worn on your arm, much like a smartwatch strap. As the research team from the University of Bristol's Bristol Interaction Group demonstrates, muscle and tendon movement shows up clearly in ultrasound images of the forearm.
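To make the general idea concrete, here is a minimal Python sketch of how such a system might turn ultrasound frames of the forearm into gesture labels: each frame is treated as a feature vector and fed to a standard classifier. The gesture names, frame size, and synthetic data below are purely illustrative assumptions, not details taken from the EchoFlex research, which is only described at a high level here; a real system would use proper image features and labeled recordings from actual participants.

# Hypothetical sketch: classify hand gestures from ultrasound frames of the
# forearm. Everything below (gesture set, frame size, synthetic data) is an
# assumption made for illustration, not the EchoFlex implementation.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

GESTURES = ["fist", "open_hand", "pinch", "point"]   # hypothetical gesture set
FRAME_SHAPE = (64, 64)                               # hypothetical downsampled frame size

def fake_ultrasound_frame(gesture_id: int) -> np.ndarray:
    """Stand-in for a real ultrasound frame: random texture whose mean
    intensity shifts with the gesture, mimicking different muscle states."""
    base = np.random.rand(*FRAME_SHAPE)
    return base + 0.3 * gesture_id

# Build a toy dataset: 100 frames per gesture, each flattened to a feature vector.
X, y = [], []
for gid, _ in enumerate(GESTURES):
    for _ in range(100):
        X.append(fake_ultrasound_frame(gid).ravel())
        y.append(gid)
X, y = np.array(X), np.array(y)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Simple pipeline: normalize the pixel features, then an RBF-kernel SVM classifier.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X_train, y_train)

print("toy accuracy:", model.score(X_test, y_test))
print("predicted gesture:", GESTURES[int(model.predict(X_test[:1])[0])])

In a wearable like the one the researchers envision, the frames would come from a small ultrasound transducer in the band rather than a synthetic generator, but the overall pipeline of imaging, feature extraction, and classification would look broadly similar.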