Can you imagine operating your smartphone through hand gestures instead of typing on your keypad?
Well, scientists at ETH Zurich have developed an app that lets users operate their smartphones with hand gestures. A person holds the phone in one hand while moving the other in the air above the built-in camera, making gestures that resemble sign language.
For example, you can move your index finger to the left or right, spread your fingers apart, or mimic firing a pistol. Spreading your fingers magnifies a section of a map or scrolls a page of a book forward. Mimicking a pistol shot switches to a new browser tab, changes a map from standard to satellite view, or shoots down enemies in your favorite game.
This innovative use of gestures is made possible by a new type of algorithm developed by Jie Song, a master's student in the working group led by Professor Otmar Hilliges.
The program uses the smartphone's built-in camera to register its environment. It evaluates neither color nor depth; instead, it registers the shape of the hand and the gesture being made, reduces it to a simple outline, and classifies that outline against stored gestures. The program then executes the command associated with the gesture it has recognized. It can also estimate the hand's distance from the camera and warns the user if the hand is too close or too far away.
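The classification step described above can be illustrated with a minimal sketch: represent each stored gesture as a binary silhouette on a coarse grid, match an observed outline to the nearest template, and warn when the hand covers too much or too little of the frame. All gesture names, grid sizes, and thresholds here are hypothetical, for illustration only; they are not taken from the actual ETH algorithm.

```python
# Hypothetical sketch of outline-based gesture matching on a coarse grid.
# A silhouette is the set of (row, col) cells the hand covers.

def hamming(a: frozenset, b: frozenset) -> int:
    """Number of grid cells where two binary silhouettes differ."""
    return len(a ^ b)

def classify(silhouette: frozenset, templates: dict) -> str:
    """Return the stored gesture whose outline is closest to the input."""
    return min(templates, key=lambda name: hamming(silhouette, templates[name]))

def distance_warning(silhouette: frozenset, grid: int = 8,
                     near: float = 0.6, far: float = 0.05) -> str:
    """Stand-in for the too-near / too-far check: warn when the hand
    fills too much or too little of the camera frame."""
    coverage = len(silhouette) / (grid * grid)
    if coverage > near:
        return "too near"
    if coverage < far:
        return "too far"
    return "ok"

# Toy 8x8 templates (illustrative patterns, not real training data).
TEMPLATES = {
    # an extended index finger pointing right
    "swipe_right": frozenset((3, c) for c in range(1, 7)),
    # fingers spread apart: several short vertical strokes
    "spread": frozenset((r, c) for c in (1, 3, 5) for r in range(2, 6)),
}

# A noisy observation that still resembles "swipe_right".
observed = frozenset({(3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (4, 5)})
print(classify(observed, TEMPLATES))   # → swipe_right
print(distance_warning(observed))      # → ok
```

A real implementation would extract the silhouette from camera frames and learn the templates from data, but the nearest-template idea is the same.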
Currently, the program recognizes six different hand gestures and executes the corresponding command for each. Although the researchers have tested it with only 16 outlines, that is not its theoretical limit.
Because the algorithm uses only a small amount of memory, Hilliges says it is well suited to smartphones. Its minimal processing footprint also makes it ideal for augmented-reality glasses and smartwatches.