In my opinion, the cloud model of storing data may at some point have to be reversed.
Instead of storing huge amounts of data centrally, we should do it the other way around: data storage would be personalized, and a high-bandwidth network would access, collate, and process data from such personal data centers.
This is a very interesting topic. It looks like it's going to take a considerable amount of research to develop new algorithms that ensure both the correct storage of data and the storage of the correct data. Let's not forget that Bluetooth Low Energy is currently drafting specifications for medical profiles such as the weight scale and pulse oximeter. Extrapolating to see where this will lead makes me think that a lot of medical data will be created and will have to be stored as well. That's another big chunk of data.
To keep the amount of data stored on servers globally in check, we need intelligent algorithms that will automatically filter out and delete duplicate data, data unused over a period of time, and so on.
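One simple building block for such filtering is content-addressed deduplication: hash each blob and keep only one copy per digest. A minimal sketch (the file names and data here are made up for illustration):

```python
import hashlib

def dedupe(blobs):
    """Keep one copy of each distinct blob, identified by its SHA-256 digest."""
    seen = {}
    for name, data in blobs.items():
        digest = hashlib.sha256(data).hexdigest()
        # The first blob seen with a given digest wins; later exact
        # duplicates are dropped.
        seen.setdefault(digest, name)
    return set(seen.values())

files = {
    "a.bin": b"sensor reading 42",
    "b.bin": b"sensor reading 42",   # exact duplicate of a.bin
    "c.bin": b"sensor reading 43",
}
print(sorted(dedupe(files)))  # only one of the two duplicates survives
```

Real systems deduplicate at the block level rather than whole files, and the "unused over a period of time" policy would need access-time metadata on top of this, but the hashing idea is the same.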
Dr. Rao did not mention the coming explosion of data from the Internet of Things (IoT), enabled by sensor networks. When data from billions of sensor tags lands on the cloud, the problem Dr. Rao describes has the potential to grow by many orders of magnitude!
To give a sampling of the data explosion: a home energy management solution (monitoring connected outlets as well as lighting) will typically put out 100Mb/day, and that with only intermittent monitoring. The Brazil-Tag (automobile e-license plate), which is only available in a couple of cities in that country, puts out ~8.6Gb/day. A typical cosmopolitan air quality monitoring system writes out 112Mb/day... well, you get the picture.
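Scaling those per-installation figures up gives a feel for the aggregate. A back-of-envelope sketch, where the deployment counts are purely hypothetical and "Mb" is read loosely as megabytes:

```python
MB = 1_000_000  # interpreting the quoted "Mb"/"Gb" as megabytes/gigabytes

# name: (bytes per installation per day, assumed number of installations)
sources = {
    "home energy management": (100 * MB, 1_000_000),
    "Brazil-Tag e-license plate": (8_600 * MB, 5_000_000),
    "air quality monitoring": (112 * MB, 1_000),
}

total = sum(rate * count for rate, count in sources.values())
print(f"{total / 1e15:.1f} PB/day")  # petabytes per day across these assumed fleets
```

Even with these modest fleet sizes the total lands in the tens of petabytes per day, which is the scale the storage problem has to contend with.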
The challenges of analytics, intelligence, prognostics, etc., from sensor networks were discussed in a SensorCon panel that I took part in yesterday:
The volume of data available, and the rate at which new data is flowing in, are remarkably high and fast, as Mr. Guru mentioned here. I feel that processing speed and storage technology are simultaneously gearing up and will always meet the demands. Just compare computer technology from 1992 to 2012: from the Intel Pentium series to the present multi-core processors, HDDs, and memory, there has been roughly a 50x improvement over twenty years. The optical interconnect between the processor and peripheral components gives great hope that within 8 years we can see the same 50x growth that happened over the past 20 years.
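It's worth noting what compressing that growth into 8 years implies for the annual rate. The 50x-over-20-years figure is from the comment above; the 8-year scenario is the hoped-for case, not a forecast:

```python
def cagr(factor, years):
    """Compound annual growth rate implied by an overall growth factor."""
    return factor ** (1 / years) - 1

# 50x over 20 years is roughly 22% per year; squeezing the same 50x into
# 8 years would require roughly 63% per year, about triple the pace.
print(f"50x over 20 years = {cagr(50, 20):.0%}/year")
print(f"50x over  8 years = {cagr(50, 8):.0%}/year")
```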
NASA's Orion Flight Software Production Systems Manager Darrel G. Raines joins Planet Analog Editor Steve Taranovich and Embedded.com Editor Max Maxfield on EE Times Radio to talk about embedded flight software used on the Mars mission. Live radio show and live chat. Get your questions ready.