The best way to interact with a device is, yes, you guessed it, with our hands. At its Google I/O sessions, the company unveiled its approach to gesture control for wearable devices, named Project Soli: a tiny radar-based chip capable of tracking sub-millimeter motions. This interactive sensor measures the movements and gestures of the fingers and translates the resulting signal into commands, allowing users to control different features as if they were using a conventional touch interface.

The system uses the data gathered from the Doppler image, the IQ signal and the spectrogram. Operating at 60 GHz, the radar can capture motion at a rate of up to 10,000 frames per second, making it far more accurate than contemporary camera-based systems.
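To illustrate the idea behind such processing (this is a generic sketch, not Soli's actual pipeline, and the sampling rate and Doppler shift below are made-up values), the complex IQ samples from a radar can be turned into a Doppler spectrogram with a short-time FFT, where a moving finger shows up as a frequency shift:

```python
# Sketch: extracting a Doppler shift from simulated radar IQ data.
# Assumptions: fs and doppler_hz are illustrative values, not Soli specs.
import numpy as np

fs = 2000.0                      # assumed frame rate of the radar (frames/s)
t = np.arange(0, 1.0, 1 / fs)

# Simulate IQ returns from a finger moving toward the sensor:
# radial motion appears as a Doppler frequency shift in the complex signal.
doppler_hz = 300.0
rng = np.random.default_rng(0)
iq = np.exp(2j * np.pi * doppler_hz * t) + 0.1 * (
    rng.standard_normal(t.size) + 1j * rng.standard_normal(t.size))

# Short-time FFT: split into overlapping windows, FFT each windowed slice.
win, hop = 128, 64
frames = [iq[i:i + win] * np.hanning(win)
          for i in range(0, iq.size - win, hop)]
spectrogram = np.abs(np.fft.fftshift(np.fft.fft(frames, axis=1), axes=1))

# The Doppler bin with the most energy reveals the motion's speed and sign.
freqs = np.fft.fftshift(np.fft.fftfreq(win, 1 / fs))
peak = freqs[np.argmax(spectrogram.mean(axis=0))]
print(f"dominant Doppler shift ~ {peak:.0f} Hz")
```

A gesture recognizer would then feed a sequence of such spectrogram frames into a classifier, mapping motion signatures to commands.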
Is this the next level of gesture control? The user interfaces on the current range of wearable devices are still a bit clumsy, and this technology has the potential to solve that problem.
Although it's still at an early development stage, we like the look of what we see!
Stay up to date with the latest news, demos and releases.
Subscribe to Elektor.TV on YouTube!