Last year, Google gave us a taste of Project Soli, an effort to deliver radar-powered finger gesture control to wearables. Today, we got a closer look at how it could be implemented in real products, starting with a customized LG Urbane smartwatch. Google reps were able to control the watch simply by holding their fingers in front of it. As they moved closer, even more options opened up. As they moved away, the standard watch face returned. It’s just a demo right now, but Soli could solve the problem of controlling smart devices with tiny screens (or with no screens at all).
“We’ve developed a vision where the hand is the only controller you need,” said Ivan Poupyrev, a technical program lead at Google’s Advanced Technology and Projects (ATAP) group. “One moment it’s a virtual dial, or slider, or a button.” Basically, Google is trying to create a whole new gesture language for every device in your home.
After rolling out a Project Soli alpha developer kit last year, Google selected 60 developers from a pool of 180 applicants to show off their implementations. One group used Soli to identify materials like copper, while another used it for 3D imaging. The coolest experiment, though, was using it as an in-car remote control. Imagine triggering gesture controls in your car just by raising your fingers slightly off the steering wheel.
While Google’s initial Soli dev kit worked, it was a bit of a pain. Lawyers made the Soli group put a warning on the back of the kits, because they drew an insane amount of power. They also had to be connected to powerful desktops to work. Over the past year, Google set about refining Soli’s design so it can be implemented anywhere.
“If you can make something work on a smartwatch, you can make it run any way you want,” Poupyrev said.
Together with Infineon, Google reduced Soli's power consumption 22x, from 1.2 watts to 0.054 watts. By making its code 256x more efficient, the team also got Soli running on standard mobile chips like Qualcomm's Snapdragon 400 and a recent Intel Atom. And despite those tweaks, Soli still manages an 18,000 FPS radar frame rate.
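Those figures are easy to sanity-check with quick arithmetic. The sketch below uses only the numbers quoted above; it assumes nothing else about Soli's internals:

```python
# Power figures Google quoted for Project Soli's redesign (watts)
old_power_w = 1.2
new_power_w = 0.054

# The reported ~22x power reduction checks out:
reduction = old_power_w / new_power_w
print(f"Power reduction: {reduction:.1f}x")  # roughly 22x

# At 18,000 radar frames per second, the per-frame energy
# budget on the redesigned chip is tiny:
frames_per_second = 18_000
joules_per_frame = new_power_w / frames_per_second
print(f"Energy per radar frame: {joules_per_frame * 1e6:.1f} microjoules")
```

In other words, each radar frame gets on the order of a few microjoules, which is why the redesign can live inside a smartwatch instead of tethered to a desktop.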
Google initially mounted the Soli chip on the watchband of the LG Urbane, but that still looked pretty clunky. With some help from LG, the team was able to fit it into a small area right below the screen. What's really intriguing is how fine-grained the gestures Soli can detect are: it can even tell when you're rubbing your fingers together.
The Soli team also implemented its technology in a JBL speaker, which recognized larger finger and hand gestures. It lit up as someone’s hand drew near, and it was able to skip tracks with a simple thumbs-up gesture. It’s an example of how Soli could be used to control smart home devices from afar, without touching them.
So what’s next for Soli? We can expect to see more experimental implementations over time. And next year, Google will roll out a beta version of the Soli dev kit, which looks significantly smaller than what devs have today. It’ll be a while until this technology reaches typical consumer products, but Google’s progress over the past year is impressive all the same.
For all the latest news and updates from Google I/O 2016, follow along here.
Source: Project Soli