"This allows the satellites to do a better job of flying around on the space station and understanding where exactly they are," Terry Fong, director of the Ames Intelligent Robotics Group, said in the February NASA announcement.
Google explains Project Tango and its 3D sensing in a YouTube video, and NASA demonstrates SPHERES and discusses Tango in a video of its own.
According to an iFixit teardown of a Project Tango phone, Tango's most surprising piece of 3D hardware is a new Capri PS1200 3D-imaging SoC built by PrimeSense, the depth-sensing company Apple acquired. The chip has not yet shown up in any Apple hardware.
It also has a top-of-the-line Qualcomm Snapdragon 800 (8974) processor, 2GB of Elpida LPDDR3 RAM, two Myriad 1 image co-processors, and two AMIC A25L016 16Mbit low-voltage flash memory ICs. MEMS motion tracking comes from an InvenSense MPU-9150, a nine-axis gyroscope/accelerometer/compass. An OmniVision CameraChip with a fisheye lens provides a 180-degree field of vision; a second OmniVision sensor sees in both the visible and infrared spectrums to help judge depth; and a front-facing camera offers a 120-degree field of vision. The device ponders all that vision using 16Mbit of Winbond SPI flash memory and 64GB of SanDisk iNAND flash memory.
The iFixit crew called Project Tango "exceptionally cool" and gave it a Repairability Score of 9 out of 10, though its review seemed slightly less giddy than NASA's assessment.
"This is no ordinary upgrade -- we've customized cutting-edge commercial technologies to help us answer questions like: How can robots help humans live and work in space?" Fong said in the February release. "Building on our experience in controlling robots on the space station, one day we'll be able to apply what we've learned and have humans and robots working together everywhere from Earth's orbit, to the moon, asteroids and Mars."
Tango and SPHERES will go on several test flights starting next month and will go to the ISS late this year, according to NASA. Fong said he hopes Tango's improved ability to sense and track its own position in three dimensions will let the robots fly themselves well enough to handle inspections or repairs outside the ISS hull with only minimal instructions from controllers inside the station or on the ground.
A robot that can fly well enough to hold its position while waiting seconds for a command to arrive from a controller on the ground might be able to work relatively unassisted outside the space station. A robot that needed real-time piloting just to keep from drifting too far away to return under its own power would be far less useful. The exterior surface of a space station may be only inches from the interior, but it's a lot closer to the ends of the universe for a bowling-ball-sized robot with the brain of a cellphone and two tiny bottles of CO2 to help get it back home.