How do you improve the touchscreen user experience? According to Microsoft Research, you learn to anticipate the touches before they even happen.
Don’t worry, Microsoft isn’t working on some brain-scanning smartphone — at least not yet. Microsoft’s Ken Hinckley says they’re using “the hands as a window to the mind.” He and his co-workers have been experimenting with something they call “pre-touch sensing,” and it’s a bit like what you may have experienced on a Samsung smartphone with the Air View/Air Gesture feature turned on.
Microsoft’s ten-person team has developed a system that tracks your fingers as they hover over a phone’s display. It’s much more than just a way to scroll through lengthy emails and webpages, or a way to interact with your phone while you’re wearing a pair of gloves.
One Air View-like feature is playback controls that fade into view when your fingers get close to the screen while a video plays. Microsoft’s controls are adaptive, though. You’ll see one layout if one-handed use is detected and another if you’re holding your phone with both hands and using your thumbs to interact. To fast-forward and rewind playback, for example, pre-touch sensing puts a jog dial right where it figures your thumb is going to land on the display.
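The adaptive behavior described above can be sketched roughly as a grip-to-layout decision. This is purely illustrative: the grip states, layout names, and function are hypothetical assumptions, not Microsoft’s actual system.

```python
# Hypothetical sketch of grip-adaptive video controls, based on the
# behavior described in the article. Nothing here is Microsoft's API.

def choose_layout(grip, thumb_x=None, thumb_y=None):
    """Pick a video-control layout from the detected grip.

    grip: "none" (no hover detected), "one_handed", or "two_handed".
    thumb_x, thumb_y: predicted landing point of the approaching thumb,
    used to position the jog dial under it.
    """
    if grip == "none":
        # No fingers near the screen: controls stay faded out.
        return {"visible": False}
    if grip == "one_handed":
        # Compact layout clustered within reach of the gripping thumb.
        return {"visible": True, "layout": "compact",
                "jog_dial_at": (thumb_x, thumb_y)}
    # Two-handed grip: full layout, jog dial under the active thumb.
    return {"visible": True, "layout": "full",
            "jog_dial_at": (thumb_x, thumb_y)}
```

The interesting design point is the last line of each branch: the jog dial isn’t at a fixed position but is placed at the predicted touch point, which is what pre-touch sensing makes possible.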
It also works a bit like 3D Touch, except that the 3D part takes place in the air over the phone, not on its surface. Hold a finger on one of your downloads and bring a second finger close, and a menu pops into view that lets you copy, delete, or rename the file.
Wave a finger over a link-heavy web page, and pre-touch sensing highlights the links beneath your fingertip as it moves across the screen. It works with embedded videos, too.
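One simple way to picture the highlighting behavior is a proximity test between the hovering fingertip and on-screen targets. Again, this is a hypothetical sketch with made-up names and a made-up activation radius, not the actual implementation.

```python
# Illustrative sketch: light up any tappable targets near the fingertip
# hovering above the screen. The radius value is an assumption.
import math


def targets_to_highlight(hover, targets, radius=60.0):
    """Return the names of targets close enough to the hover point.

    hover: (x, y) fingertip position projected onto the screen.
    targets: dict mapping target name -> (x, y) center in pixels.
    radius: hypothetical activation radius in pixels.
    """
    hx, hy = hover
    return sorted(name for name, (tx, ty) in targets.items()
                  if math.hypot(tx - hx, ty - hy) <= radius)
```

A real system would presumably weigh more than distance (finger trajectory, which hand is hovering), but the core idea is reacting before contact is made.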
As for when you’ll see Microsoft’s pre-touch sensing wizardry packaged into an app you can download or built into an OS, that remains to be seen. Microsoft Research has turned out some pretty amazing projects in the past that we never had a chance to play with.