I love the film idea and the possibilities, but given the amount of information to be processed in real time (blending 36 video streams!), I bet the only feasible way of doing this for a low/medium-volume product is with a high-end FPGA.
Yes, that Portland company was Dodeca, later bought out by Immersive Media. We did the real-time DSP hardware and software. The first StreetView city was Portland for that reason! Parallax is the real problem to correct. More cameras certainly make that problem easier to solve, but more video streams make the processing harder. Parallax is not as big a problem farther away; at distance, the limit becomes sensor and lens resolution, and small lenses tend to be the limiting factor for image quality.
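To see why parallax matters less at distance, here is a minimal sketch of the standard stereo-disparity relation d = f·B/Z for two adjacent cameras in a rig. The focal length and lens spacing below are illustrative assumptions, not the actual geometry of any real camera ball:

```python
def disparity_px(focal_px: float, baseline_m: float, depth_m: float) -> float:
    """Pixel disparity between two cameras: d = f * B / Z."""
    return focal_px * baseline_m / depth_m

focal_px = 800.0    # assumed focal length in pixels
baseline_m = 0.06   # assumed spacing between adjacent lenses (6 cm)

for depth_m in (1.0, 5.0, 50.0):
    d = disparity_px(focal_px, baseline_m, depth_m)
    print(f"{depth_m:5.1f} m -> {d:6.2f} px of parallax to blend away")
```

With these numbers, a subject at 1 m produces tens of pixels of misalignment between neighboring streams, while at 50 m the parallax drops below a pixel, at which point stitching quality is limited by sensor and lens resolution instead.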
NASA's Orion Flight Software Production Systems Manager Darrel G. Raines joins Planet Analog Editor Steve Taranovich and Embedded.com Editor Max Maxfield to talk about the embedded flight software used in the Orion spacecraft, part of NASA's Mars mission. Live radio show and live chat; get your questions ready.