New research by the University of Bristol has shown future wearable devices, such as smartwatches, could use ultrasound imaging to sense hand gestures.
Computers are growing in number and wearable computers, such as smartwatches, are gaining popularity.
Devices around the home, such as WiFi lightbulbs and smart thermostats, are also on the increase.
However, current technology limits the capability to interact with these devices.
Hand gestures have been suggested as an intuitive and easy way of interacting with and controlling smart devices in different surroundings.
For instance, a gesture could be used to dim the lights in the living room, or to open or close a window.
Hand gesture recognition can be achieved in many ways, but the placement of a sensor is a major restriction and often rules out certain techniques.
However, with smartwatches becoming the leading wearable device, sensors can be placed in the watch itself to sense hand movement.
The research team proposes that ultrasonic imaging of the forearm could be used to recognise hand gestures.
Ultrasonic imaging is already used in medicine, for example in pregnancy scans and to image muscle and tendon movement, and the researchers saw its potential as a way of understanding hand movement.
The team used image processing algorithms and machine learning to classify muscle movement as gestures.
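The article does not detail the actual EchoFlex pipeline, but the general idea of turning imaging frames into gesture labels can be illustrated with a toy sketch. Everything below is hypothetical: the feature extraction (downsampling and flattening a frame) and the nearest-centroid classifier stand in for whatever image processing and machine learning the researchers used, and the "ultrasound frames" are synthetic arrays.

```python
import numpy as np

def extract_features(frame):
    # Hypothetical feature step: downsample the frame and flatten it
    # into a vector. A real system would compute richer image features.
    return frame[::4, ::4].ravel().astype(float)

class NearestCentroidGestureClassifier:
    """Toy classifier: one centroid per gesture in feature space."""

    def fit(self, frames, labels):
        feats = np.array([extract_features(f) for f in frames])
        labels = np.array(labels)
        self.labels_ = sorted(set(labels))
        # Average the feature vectors of each gesture's training frames.
        self.centroids_ = np.array(
            [feats[labels == g].mean(axis=0) for g in self.labels_]
        )
        return self

    def predict(self, frame):
        # Assign the gesture whose centroid is closest to the new frame.
        d = np.linalg.norm(self.centroids_ - extract_features(frame), axis=1)
        return self.labels_[int(np.argmin(d))]

# Synthetic "ultrasound frames": each gesture shifts the mean intensity,
# standing in for the distinct forearm muscle patterns each gesture makes.
rng = np.random.default_rng(0)
frames, labels = [], []
for gesture, base in [("fist", 0.2), ("open", 0.5), ("pinch", 0.8)]:
    for _ in range(10):
        frames.append(base + 0.05 * rng.standard_normal((64, 64)))
        labels.append(gesture)

clf = NearestCentroidGestureClassifier().fit(frames, labels)
print(clf.predict(0.2 + 0.05 * rng.standard_normal((64, 64))))
```

In this contrived setup the new frame is classified as "fist", since its intensities sit closest to that gesture's training centroid; the point is only the shape of the pipeline (frames in, features out, label predicted), not the specific features or model.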
The researchers also carried out a user study to find the best sensor placement for this technique.
The team's findings showed very high recognition accuracy. Importantly, the sensing method worked well at the wrist, which is ideal because it would allow future wearable devices, such as smartwatches, to incorporate this ultrasonic technique to sense gestures.
The researchers named their hand gesture recognition system EchoFlex.
Source: YouTube