There was a start-up in Portland, Oregon, near where I live, back in the late 90's or early 00's, that built cameras like that. The hardware really isn't that new. The real magic in one of these is the lenses and the software to stitch it all together.
With 36 cameras, each one can have a narrower field of view than systems with fewer cameras. That would allow for less distortion at the seams.
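As a rough back-of-the-envelope check on that claim, you can estimate the minimum field of view each camera needs to tile a full sphere. This is a simplified sketch of my own (not from the article): it models each camera's coverage as a circular cone, splits the sphere's 4π steradians evenly across N cameras, and ignores the extra overlap real stitching software needs for blending.

```python
import math

def min_cone_half_angle_deg(n_cameras):
    """Half-angle (degrees) of a cone whose solid angle is 1/n of a sphere.

    Solid angle of a cone with half-angle theta: 2*pi*(1 - cos(theta)).
    Setting that equal to 4*pi/n gives cos(theta) = 1 - 2/n.
    """
    return math.degrees(math.acos(1.0 - 2.0 / n_cameras))

# A 12-camera dodecahedral rig needs roughly a 34-degree half-angle
# (about 67 degrees full cone) per camera, ignoring blend overlap.
print(round(min_cone_half_angle_deg(12), 1))

# A 36-camera rig only needs about a 19-degree half-angle per camera,
# so each lens covers a much narrower, less distorted patch of the scene.
print(round(min_cone_half_angle_deg(36), 1))
```

The narrower cone for 36 cameras is why each image needs less wide-angle distortion correction before stitching; the trade-off is three times as many seams to blend.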
I don't know how much post-processing is needed, or how much computing power and time it takes. Those would be good bits to know.
The Immersive Media product also had a real-time image processor linked to VR goggles and a head-mounted tracker, so you could, in real time, look around the scene as if you were standing wherever the camera was, in any direction at all. And with the stitching/blending algorithms running on real-time DSPs, it was seamless and immersive.
The camera used for Google Streetview is already an 11-camera unit mounted in a dodecahedron pattern (12 faces, but a camera pointing straight down into the mounting mast is not too useful). That technology is at least 12+ years old. Check out http://immersivemedia.com/
I've seen two companies, Finwe & Kolor, collaborate to bring a similar solution using 6 to 12 GoPro cameras, and using VRase virtual reality kit + smartphone to play it back. Probably not as neat as the solution above, but possibly more cost effective.
About the only thing I remember from my trip to Disney World / Epcot was the 360 movie in the Mongolia theater. It was only a horizontal 360, but was quite impressive. Adding in 360 x 360 would be pretty incredible.