In the world of consumer electronics, smartwatches are a popular commodity. However, smartwatches of the future may sense the world a little differently than today’s gadgets. Researchers from the University of Bristol believe next-gen wearables could sense hand gestures using ultrasonic imaging.
Hand gestures are an easy way to interact with smart devices. With Wi-Fi light bulbs, for example, users can dim their living room lights with a simple flick of the wrist. But because gesture recognition relies on sensors, where those sensors can be placed often constrains how a device can be designed and used.
The team believes that sensor-equipped smartwatches could use ultrasonic imaging of the forearm to improve hand gesture recognition. The technique is already well established in medicine, including pregnancy, muscle, and tendon scans.
To transition this existing technology into the wearable market, the team used machine learning and image processing algorithms to link muscle movement in ultrasound images with specific hand gestures. To choose the ideal sensor placement, the researchers conducted a user study.
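The article does not describe the team's actual algorithms, but the general idea of mapping ultrasound frames of the forearm to gestures can be sketched as follows. This is purely an illustrative toy (synthetic images, made-up gesture names, a simple band-intensity feature and nearest-centroid classifier), not the Bristol pipeline:

```python
# Illustrative sketch only: classify synthetic "ultrasound" frames into
# gestures via crude image features and a nearest-centroid classifier.
import numpy as np

rng = np.random.default_rng(0)

def extract_features(frame):
    """Reduce a 2-D frame to a small vector: mean intensity per
    horizontal band (a crude stand-in for a muscle-profile descriptor)."""
    bands = np.array_split(frame, 4, axis=0)
    return np.array([b.mean() for b in bands])

def make_frames(gesture_id, n=20, size=(32, 32)):
    """Synthetic frames: each gesture brightens a different band,
    mimicking a different muscle group contracting under the probe."""
    frames = rng.normal(0.0, 0.1, (n, *size))
    band = gesture_id * (size[0] // 4)
    frames[:, band:band + size[0] // 4, :] += 1.0
    return frames

# Hypothetical gesture classes: 0=fist, 1=point, 2=pinch, 3=spread.
gestures = range(4)
centroids = {g: np.mean([extract_features(f) for f in make_frames(g)], axis=0)
             for g in gestures}

def predict(frame):
    """Assign the gesture whose training centroid is nearest in feature space."""
    feats = extract_features(frame)
    return min(centroids, key=lambda g: np.linalg.norm(feats - centroids[g]))

# Evaluate on fresh synthetic frames.
test_set = [(g, f) for g in gestures for f in make_frames(g, n=5)]
accuracy = np.mean([predict(f) == g for g, f in test_set])
print(f"accuracy: {accuracy:.2f}")
```

A real system would replace the synthetic frames with probe data and the hand-rolled features with a learned model, but the structure (feature extraction, training per gesture, nearest-match prediction) is the same.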
After a series of tests, the results showed improved gesture-recognition accuracy.
“With current technologies, there are many practical issues that prevent a small, portable ultrasonic imaging sensor from being integrated into a smartwatch. Nevertheless, our research is a first step towards what could be the most accurate method for detecting hand gestures in smartwatches,” says Jess McIntosh, PhD student in the Department of Computer Science and Bristol Interaction Group (BIG).
McIntosh joins Professor Mike Fraser and Asier Marzo as lead project researchers from BIG at the University of Bristol, along with University Hospitals Bristol NHS Foundation Trust (UH Bristol).