Google Project Soli will make every object smart

The Internet of Things has already changed our everyday lives: turning on the lights with a clap of the hands, dictating shopping lists into the air, and letting everyone live out their sci-fi dreams. The fantasy has always been to live in a home where all our devices can communicate with each other, and we like to imagine we’re there. Then practicality sets in and you concede that you still have to get up off the couch to pick up the sandwich that you left on the counter.

Maybe we’re too lazy.

The biggest roadblock remains: in order to reach this new technological milestone, we’d have to replace our “dumb” objects with far more expensive “smart” ones – a move we’re not ready to make. That’s not even getting into the privacy issues, the inevitable security breaches, the whole nine yards. Google formed Project Soli in search of an answer.

Adjust the volume by miming it.
Simple hand and finger gestures transmit unique signals to the tech.

Project Soli

Project Soli’s focus was to use radar to accurately track hand gestures. It’s a simple idea that now has far more interesting applications: Soli can detect the typical big motions, but it can also detect minute movements (of less than a millimeter) through obstructions and with extreme accuracy. “Radar has been used for many different things: to track cars, big objects, satellites, and planes,” says Ivan Poupyrev, the founder of Project Soli.

“We’re using the radio frequency spectrum to track micro motions of the human hand, and use that to interact with wearables, the Internet of Things, and other computing devices.”

That was four years ago.

Design Lead Carsten Schwesig says that “now we are at a point where we have the hardware where we can sense these interactions and we can put them to work.”

So how does it work?

Project Soli has designed extensive recognition software that sees gestures through radar. Certain patterns reflect human intent and the new technology can convey those intents to various devices around the house. “Imagine a button between your thumb and index finger,” says Schwesig.

How does one mime “get me a sandwich”?
If you can mime pushing a button, the sensor gets what you’re trying to convey.

“The button is not there, but pressing [as though it was] is a very clear action.” Radar can sense that action and attribute it to specific intentions. Over the past four years, the team has taught the software to recognize a wide variety of physical signals.
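In software terms, recognition like this often comes down to comparing an incoming motion signature against known gesture profiles. Here’s a toy sketch of that idea – the feature vectors, gesture names, and nearest-centroid approach are all made up for illustration, not how Soli’s actual models work:

```python
import math

# Hypothetical "profiles" for two micro-gestures, expressed as simplified
# feature vectors: (average Doppler shift, motion extent in mm, duration in s).
# A real radar pipeline would use rich range-Doppler data and learned models.
GESTURE_CENTROIDS = {
    "button_press": (0.8, 1.5, 0.2),  # quick, tiny thumb-on-finger tap
    "dial_turn":    (0.3, 4.0, 0.9),  # slower, wider thumb rub
}

def classify_gesture(features):
    """Label a motion reading with the gesture whose profile it most resembles."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(GESTURE_CENTROIDS, key=lambda g: distance(features, GESTURE_CENTROIDS[g]))

# A reading close to the "button press" profile gets labelled accordingly.
print(classify_gesture((0.75, 1.6, 0.25)))  # button_press
```

The point isn’t the math – it’s that once a gesture has a recognizable signature, miming the “invisible button” is all the input a device needs.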

How will this build a smart home?

“Soli interaction makes us realize we can interact with computers just using day-to-day objects,” explains Professor Aaron Quigley, Chair of Human Computer Interaction in the School of Computer Science at the University of St Andrews. “If I had this kind of sensor in, for instance, a regular kettle and a cup, it’s possible to detect them [with radar]. You can interact with them, and the computer will understand what you’re doing. Suddenly, every physical object in your home becomes a way to communicate with your computer.”

This could usher in a new level of digital art!
Soli recognizes gestures that make brush strokes on a tablet thinner or thicker.

It’s 2019 now and radar chips are tiny, cheap, and low-powered enough to become ubiquitous in today’s market. A recent U.S. Federal Communications Commission waiver has even authorized Project Soli to operate in the 57–64 GHz band at higher power levels than are normally permitted for everyday gadgetry. This opens up even more possibilities for what Soli could do, and could mean we’re drawing excitingly close to this powerful tech being regularly available.

Are you excited for a fully smart home, or do you think the inevitable privacy and security issues will quash the whole endeavor? Let us know in the comments below, and stay tuned for more news on Project Soli!
