Lenovo and Google announced a partnership today to create a new smartphone that makes use of 3D augmented reality technology known as Project Tango. Johnny Lee, a member of the Tango team at Google, said that the sense of space and motion from Tango will now be built into a mobile phone. "We can use the room around us to play games, and hide behind the furniture," Lee said.

Larry Yang, a former member of Microsoft's Xbox 360 hardware team, recently gave me a demo of Project Tango at Google. Yang is the lead project manager for Project Tango, which equips a mobile device with 3D-sensing capabilities so that developers can create games and other apps that are aware of their surroundings. He showed me a game that uses Tango, and it was like stepping into a future where games, 3D sensing, motion tracking, cameras, and animated overlays all combine into a very cool augmented reality experience.

Project Tango is yet another way to create the feeling of AR, where animated objects are inserted into a real 3D space that you can view through special glasses or a tablet's screen. In our demo, we used a special tablet, but you could also use something like upcoming AR glasses.

Augmented reality is expected to become a $120 billion market by 2020, according to tech advisor Digi-Capital. But first, companies such as Google have to build the platforms that make it possible. Google is demoing the technology this week at the 2016 International CES, the big tech trade show in Las Vegas.

With Tango's technology, the mobile device can use its sensors to detect the physical space around it, and then insert animations into that 3D space. So you can hunt through your own home for killer robots and shoot them with your smart device. Tango taps technologies such as computer vision, image processing, and special vision sensors.

When I arrived at Google's headquarters in Mountain View, Calif., Yang greeted me with a tablet that had Tango running on it. I asked to go to the restroom.
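To make the "insert animations into that 3D space" idea concrete, here is a minimal, hypothetical sketch (not the Tango SDK, and all names are invented for illustration) of why an AR overlay appears anchored to the room: a virtual object has a fixed world position, and each frame the device's tracked camera pose is used to project that point into screen pixels, so the object seems glued in place as you move.

```python
import math

def project_to_screen(world_point, cam_pos, cam_yaw,
                      focal_px=500.0, cx=320.0, cy=240.0):
    """Project a 3D world point into 2D pixel coordinates for a camera
    at cam_pos, rotated by cam_yaw radians about the vertical axis.
    Returns None when the point is behind the camera.

    focal_px, cx, cy are made-up intrinsics for a 640x480 image."""
    # Translate the point into the camera's frame of reference.
    dx = world_point[0] - cam_pos[0]
    dy = world_point[1] - cam_pos[1]
    dz = world_point[2] - cam_pos[2]
    # Rotate by -yaw so the camera looks down the +z axis.
    cos_y, sin_y = math.cos(-cam_yaw), math.sin(-cam_yaw)
    x_cam = cos_y * dx + sin_y * dz
    z_cam = -sin_y * dx + cos_y * dz
    y_cam = dy
    if z_cam <= 0:
        return None  # behind the camera, nothing to draw
    # Standard pinhole projection: divide by depth, scale by focal length.
    u = cx + focal_px * x_cam / z_cam
    v = cy - focal_px * y_cam / z_cam
    return (u, v)

# A robot anchored 2 m straight ahead projects to the image center; as the
# camera slides left, the same world point shifts right on screen.
print(project_to_screen((0.0, 0.0, 2.0), (0.0, 0.0, 0.0), 0.0))
print(project_to_screen((0.0, 0.0, 2.0), (-1.0, 0.0, 0.0), 0.0))
```

The key point is that the robot's world coordinates never change; only the camera pose does, which is exactly what Tango's sensors supply.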
He held the tablet out in front of him and ran a query for the restroom. The tablet screen showed an image of the scene in front of Yang, but it included an animated green line that led the way through the building to the restroom. Yang followed the green line through the corridors to the restroom. The tablet knew exactly where Yang was and which direction he was facing, thanks to the motion-tracking sensors. And once Yang had mapped out the Google building's interior, Tango remembered the layout.

Project Tango devices can use visual cues to help recognize the world around them. They can self-correct errors in motion tracking and become reoriented in areas they've seen before.
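The self-correction described above can be sketched in a few lines. This is a hypothetical toy model, not Google's actual implementation: the device dead-reckons its position from motion sensors, small errors accumulate, and re-observing a remembered visual landmark lets it snap its estimate back into alignment (the class and method names are invented for illustration).

```python
class AreaLearner:
    """Toy 2D model of drift correction via remembered landmarks."""

    def __init__(self):
        self.pose = (0.0, 0.0)   # estimated (x, y) position in meters
        self.landmarks = {}      # landmark id -> position where first seen

    def move(self, dx, dy):
        """Dead-reckoned motion update; sensor error accumulates here."""
        x, y = self.pose
        self.pose = (x + dx, y + dy)

    def observe(self, landmark_id, relative_offset):
        """See a landmark at a given offset from the device.

        First sighting: remember where the landmark appears to be.
        Repeat sighting: the remembered position is trusted, so any
        disagreement is treated as accumulated drift and corrected."""
        ox, oy = relative_offset
        x, y = self.pose
        apparent = (x + ox, y + oy)
        if landmark_id not in self.landmarks:
            self.landmarks[landmark_id] = apparent
        else:
            saved = self.landmarks[landmark_id]
            # Shift the pose estimate so the landmark lines up again.
            self.pose = (x + (saved[0] - apparent[0]),
                         y + (saved[1] - apparent[1]))
        return self.pose

# Walk a loop with a little drift on each leg, then re-see the door:
# the pose estimate is pulled back to where the loop actually closed.
device = AreaLearner()
device.observe("door", (1.0, 0.0))   # remember the door's position
device.move(2.0, 0.1)                # intended (2, 0); 0.1 m of drift
device.move(-2.0, 0.1)               # intended (-2, 0); more drift
device.observe("door", (1.0, 0.0))   # door re-seen: drift cancelled
print(device.pose)
```

Real systems do this in 3D with probabilistic estimates rather than a hard snap, but the principle matches the article: remembered visual cues let the device become reoriented in places it has seen before.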