Swiping your phone’s touchscreen might disappear just as quickly as it emerged, if Google has its way. When its new technology hits the shelves, you won’t have to touch a screen ever again. Here’s why.

It’s called Project Soli, and it uses radar waves to detect precise finger movements – or as Google calls them, “micromotions”. The technology would allow detection of extremely fine movements while ignoring other, irrelevant signals (such as passing insects). The project is the work of the Advanced Technology and Projects lab, where Google works on its futuristic projects, and the prototypes already seem promising.

In one of the demos, Ivan Poupyrev changed the hours on a clock simply by turning an imaginary dial, then changed the minutes by moving his hand a little higher and repeating the same motion. He then proceeded to kick a virtual football by flicking at the screen.

Soli is fundamentally different from current technologies; unlike conventional capacitive screens, it can detect 3D motion, and unlike other motion-detection technologies such as Kinect, it is tuned to pick up fine movements. The key is the high-frequency radar (60 GHz), which enables a high enough resolution to detect these fine movements. The transmitter also doesn’t scan with a fine beam of radio waves; it uses a wide cone spread instead, which helps keep the costs down. The current proposal includes two transmitters and four receivers, which I can only assume filter out the noise and the unwanted movements. For example, if an action requires a particular movement with two fingers, you don’t want the movement of your other fingers to interfere.
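As a back-of-the-envelope illustration (this is not Soli’s actual signal chain, just standard radar physics), a quick calculation shows why a 60 GHz carrier makes millimetre-scale finger motion detectable: the wavelength is only about 5 mm, so even a finger drifting at a few centimetres per second produces a measurable Doppler shift. The 5 cm/s finger speed below is an arbitrary example value.

```python
# Why 60 GHz radar can resolve "micromotions": a toy calculation.
C = 3.0e8          # speed of light, m/s
CARRIER_HZ = 60e9  # Soli's reported operating frequency, 60 GHz

def wavelength(freq_hz: float) -> float:
    """Carrier wavelength in metres: lambda = c / f."""
    return C / freq_hz

def doppler_shift(velocity_ms: float, freq_hz: float) -> float:
    """Doppler shift (Hz) for a reflector moving toward the radar
    at velocity_ms: f_d = 2 * v / lambda."""
    return 2.0 * velocity_ms / wavelength(freq_hz)

lam = wavelength(CARRIER_HZ)          # 0.005 m, i.e. 5 mm
fd = doppler_shift(0.05, CARRIER_HZ)  # a finger moving at 5 cm/s -> 20 Hz
print(f"wavelength: {lam * 1000:.1f} mm, Doppler shift: {fd:.0f} Hz")
```

A lower-frequency radar would have a proportionally longer wavelength and correspondingly smaller Doppler shifts for the same hand motion, which is why the high carrier frequency matters for fine gestures.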

It took Google only 10 months of work to shrink this array down into a fingernail-sized chip – small enough to be integrated into electronic devices, especially smartphones. Particular attention has been paid to smartwatches, which seem to be the new gadget everyone wants to get their hands on nowadays. Smartwatches are an interesting example because they highlight a physical problem of smart devices: as they get smaller and smaller, it becomes harder and harder to actually use the touchscreen.

The possibilities for this technology are virtually endless. Enabling people to interact with their smart devices is only the first step – mixing in virtual reality and the ability to remotely control devices seems like the logical next step. So far, Google hasn’t announced whether it will release this technology itself or whether it will emerge as a stand-alone project for other companies to integrate. We’re eagerly waiting for more details.

Here’s a video showing how Soli works: