
Google Project Soli to change wearables forever with gesture radar

by Todd Haselton | May 29, 2015 10:00 am PDT

Regina Dugan, head of Google’s Advanced Technology and Projects (ATAP) group, took the stage during the second day of Google I/O 2015 to introduce some of the new technologies her team has been working on since last June.

Dugan briefly discussed the problems humans face when interacting with wearables such as smartwatches, whose screens can be too small for accurate touch input. Two projects Google ATAP is working on to address those problems are Project Soli and Project Jacquard, the latter of which focuses on building larger interactive surfaces into wearables.

Project Soli is a gesture radar solution, explained Ivan Poupyrev, a technical project lead at ATAP. Radar was already the best way to capture the fine motion of fingers in free space, in real time; the problem facing ATAP was that radar hardware was far too large to fit in a smartwatch. So ATAP built its own mobile gesture radar system in just 10 months, scaling it down from something the size of a computer to a small chip that fits in a smartwatch.

Jaime Lien, a lead research engineer on Google’s ATAP team and a radar expert, helped explain how it works for gesture sensing. From the raw radar signal, the computer can recognize your hand’s shape, size, pose and dynamics, Lien explained. “The sensor can tell if I’m wiggling my fingers, or holding still,” she said.
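Google didn’t detail Soli’s signal-processing pipeline on stage, but the general idea of pulling motion out of a raw radar signal can be sketched. Below is a minimal, hypothetical Python example (none of these names come from Soli itself): it assumes frames holds raw samples from successive radar sweeps, builds a range-Doppler map, and flags the fingers as moving when energy shows up away from zero Doppler.

```python
# Hypothetical sketch; Google has not published Soli's actual pipeline.
# Assumes `frames` is a 2D array of raw radar samples,
# shape (num_sweeps, samples_per_sweep).
import numpy as np

def motion_energy(frames, static_bins=2):
    """Estimate motion energy from raw radar sweeps via a range-Doppler map."""
    # Range FFT: resolve reflections by distance within each sweep.
    range_profile = np.fft.fft(frames, axis=1)
    # Doppler FFT: resolve velocity across sweeps at each range bin.
    doppler_map = np.fft.fftshift(np.fft.fft(range_profile, axis=0), axes=0)
    power = np.abs(doppler_map) ** 2
    # Suppress near-zero-Doppler bins: static clutter such as a hand at rest.
    center = power.shape[0] // 2
    power[center - static_bins:center + static_bins + 1, :] = 0.0
    return power.sum()

def is_wiggling(frames, threshold=1e6):
    """Crude detector: moving fingers leave energy away from zero Doppler."""
    return motion_energy(frames) > threshold
```

A real system would go well beyond this, tracking how that energy shifts over time to classify specific gestures rather than merely detecting motion.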

So what’s it good for? In one demonstration, Poupyrev hovered his hand above a smartwatch and turned an invisible watch crown, akin to the Digital Crown on the Apple Watch, to control the software: playing games, adjusting the time and more on a wearable.
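Soli’s APIs weren’t shown in code during the talk, so what follows is only a hypothetical sketch of how an app might turn a gesture stream into that virtual crown: it integrates an estimated finger-rub velocity (however the sensor derives it) into a bounded dial value, say for setting the time.

```python
# Hypothetical sketch of a "virtual crown" driven by a gesture stream.
# The class, units, and parameter names are illustrative, not Soli's API.
class VirtualCrown:
    def __init__(self, minimum=0.0, maximum=100.0, sensitivity=0.5):
        self.value = minimum
        self.minimum, self.maximum = minimum, maximum
        self.sensitivity = sensitivity  # dial change per unit finger velocity

    def update(self, finger_velocity, dt):
        """Integrate rubbing velocity over a time step into a bounded value."""
        self.value += finger_velocity * self.sensitivity * dt
        self.value = max(self.minimum, min(self.maximum, self.value))
        return self.value

# Usage: feed per-frame velocity estimates from the radar front end.
crown = VirtualCrown(minimum=0, maximum=59, sensitivity=2.0)  # e.g. minutes
minute = crown.update(finger_velocity=1.5, dt=0.033)  # one 30 fps frame
```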

The technology would let you interact with your wearable without ever touching it. Gone would be the need to tap the screen with your fingers, trying to open teeny-tiny app icons or dig through menus. It’s pretty amazing tech, and while it might be a few years away from shipping in products, the tools required to build with it are coming soon.

The hardware and developer APIs will be available later this year, Poupyrev confirmed.


Todd Haselton

Todd Haselton has been writing professionally since 2006 during his undergraduate days at Lehigh University. He started out as an intern with...
