The amount of digital data around the world is doubling every two years, thanks in large part to innovative new ways to create and share information. Mobile devices, social networking, and the internet will contribute to a data glut of historic proportions by 2020, one that, according to market researcher IDC, will be 50 times greater than current levels.
The question is, are we ready for all that data? Better yet, are our systems equipped with the innovative technology to manage it?
What's needed is the technology to harness that data to allow both businesses and consumers to make decisions based on quality analysis, rather than experience and intuition. In doing so, we would assume a more scientific approach to our businesses and lives, whether it's a doctor diagnosing a patient, a homeowner choosing the most efficient time to do laundry, or a meteorologist predicting a hurricane.
To this end, we need to build information technology (IT) systems that can not only filter and store all this big data, but also make use of it. For more than 50 years, we've been operating with the same IT elements: processor, memory, storage, database, and programs. And we've designed IT systems to handle business process automation, long business cycles, and terabytes of largely structured data.
But that's not going to cut it anymore. Data is only getting bigger, and the only way technology will keep up is if computing gets smarter. Systems designed for transaction processing and structured data can't deliver the levels of performance that both businesses and consumers are demanding now and will require in the very near future.
It's time to significantly shift the computing paradigm: from computers that calculate to computers that learn from and adapt to all data, structured and unstructured, such as emails, presentations, and videos.
Last year, IBM's Watson, a high-powered question-answering system, showed the world what is possible when a finely tuned learning system tackles big data with advanced analytics, competing against and besting two human champions on the Jeopardy! game show.
Today, IBM and partners are putting Watson to work in industries from healthcare to banking. Watson has given us a glimpse into the monumental shift in computing that will affect businesses in every industry and consumers around the world. But that's just the beginning.
Future generations of optimized systems will benefit enterprises across industries as they deal with common and complex data center issues. We are on the verge of expert integrated systems with built-in knowledge of how to perform complex tasks, grounded in proven best practices: systems that not only recognize changes in the computing environment, but also anticipate them. As workload demands spike, the systems respond. When new applications or upgrades are needed, they're deployed against best practices and integrated patterns.
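To make that anticipatory behavior concrete, here is a minimal sketch of capacity planning that provisions for a predicted load rather than the last observed one; the linear extrapolation, per-server capacity, and headroom factor are all illustrative assumptions, not a description of any actual IBM mechanism.

```python
import math

def predict_next(samples):
    """Naive linear extrapolation from the last two load samples."""
    if len(samples) < 2:
        return samples[-1]
    return samples[-1] + (samples[-1] - samples[-2])

def servers_needed(predicted_load, per_server=100, headroom=1.2):
    """Provision for the predicted load plus a safety margin."""
    return max(1, math.ceil(predicted_load * headroom / per_server))

recent = [300, 340, 420]                      # requests/sec, trending upward
print(servers_needed(predict_next(recent)))   # provisions 6 servers for a predicted 500 req/s
```

A production system would use a far richer model, but the design point is the same: act on where the workload is heading, not where it was.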
In order to deal with the explosive growth of data, the systems that store it will have to get very efficient and smart. They will do this by deploying advanced capabilities including universal storage virtualization, compression, data de-duplication, and automated tiering that keeps data balanced for cost, speed, and access patterns. In addition, next-generation integration technologies enable systems of integrated storage, networking, and servers to make these capabilities easier to deploy.
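Of these capabilities, de-duplication is perhaps the easiest to illustrate. Below is a minimal, hypothetical sketch of block-level de-duplication via content addressing: identical blocks hash to the same digest and are stored only once. The class and method names are invented for illustration.

```python
import hashlib

class DedupStore:
    """Toy content-addressable store: identical blocks are kept once."""

    def __init__(self):
        self.blocks = {}   # digest -> block bytes
        self.refs = {}     # digest -> reference count

    def put(self, block: bytes) -> str:
        digest = hashlib.sha256(block).hexdigest()
        if digest not in self.blocks:          # only unseen content costs space
            self.blocks[digest] = block
        self.refs[digest] = self.refs.get(digest, 0) + 1
        return digest                          # caller keeps the digest as a handle

    def get(self, digest: str) -> bytes:
        return self.blocks[digest]

store = DedupStore()
h1 = store.put(b"same email attachment")
h2 = store.put(b"same email attachment")      # duplicate: no extra space used
assert h1 == h2 and len(store.blocks) == 1
```

Real systems de-duplicate at variable block boundaries and must handle hash collisions and garbage collection, but the space savings come from exactly this idea.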
As we rush to the future, generating, storing, and managing ever greater mountains of digital information along the way, the time to start questioning whether our systems are up to the task is now. And if you wondered whether they could get any smarter, the answer is simple: yes.

Dr. Rao is IBM Fellow and VP, Systems & Technology Group Development, India & Southeast Asia.
In my opinion, at some point the cloud model of storing data may have to be reversed.
Instead of storing huge amounts of data centrally, we should do it the other way around: data storage should be personalized, and a high-bandwidth network should be able to access, collate, and process data from such personalized data centers.
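Here is a minimal sketch of the inverted model described here, in which a coordinator fans a query out to many personal data stores and collates only the matches; every name in it (PersonalStore, federated_query, the record layout) is hypothetical, made up to illustrate the idea.

```python
from concurrent.futures import ThreadPoolExecutor

class PersonalStore:
    """Stands in for one user's personal data center."""

    def __init__(self, owner, records):
        self.owner = owner
        self.records = records

    def query(self, predicate):
        # Filtering happens at the edge; only matches leave the store.
        return [r for r in self.records if predicate(r)]

def federated_query(stores, predicate):
    """Fan the query out over the network (simulated with threads), then collate."""
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(lambda s: s.query(predicate), stores))
    return [r for partial in partials for r in partial]

stores = [
    PersonalStore("alice", [{"kwh": 1.2}, {"kwh": 3.4}]),
    PersonalStore("bob", [{"kwh": 2.8}]),
]
print(federated_query(stores, lambda r: r["kwh"] > 2))   # [{'kwh': 3.4}, {'kwh': 2.8}]
```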
This is a very interesting topic. It looks like it's going to take a considerable amount of research to develop new algorithms that ensure both the correct storage of data and the storage of the correct data. Let's not forget that Bluetooth Low Energy is currently drawing up specifications for some medical profiles, like the weight scale and pulse oximeter. Extrapolating from that, I expect a lot of medical data will be created and will have to be stored as well. That's another big chunk of data.
To keep the amount of data stored on servers globally in check, we need intelligent algorithms that will automatically filter out and delete duplicate data, data unused over a long period of time, and so on.
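As a companion to the de-duplication sketch earlier, here is a minimal, hypothetical sketch of the age-based cleanup this comment suggests; the retention window and the record layout are assumptions chosen purely for illustration.

```python
import time

RETENTION_SECONDS = 180 * 24 * 3600   # e.g. purge data untouched for ~6 months

def expired(records, now=None):
    """Return the records whose last access falls outside the retention window."""
    now = now or time.time()
    return [r for r in records if now - r["last_access"] > RETENTION_SECONDS]

records = [
    {"name": "report.pdf", "last_access": time.time()},                    # fresh
    {"name": "old_log.txt", "last_access": time.time() - 400 * 24 * 3600}, # stale
]
print([r["name"] for r in expired(records)])   # -> ['old_log.txt']
```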
Dr. Rao did not mention the coming explosion of data from the Internet of Things (IoT), enabled by sensor networks. When data from billions of sensor tags lands in the cloud, the problem Dr. Rao describes could grow by many orders of magnitude!
To give a sampling of the data explosion: a home energy management solution (monitoring connected outlets as well as lighting) will typically put out 100 Mb/day, and that with only intermittent monitoring. The Brazil-Tag (automobile e-license plate), which is available in only a couple of that country's cities, puts out ~8.6 Gb/day. A typical cosmopolitan air quality monitoring system writes out 112 Mb/day... well, you get the picture.
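A back-of-the-envelope aggregation shows how fast these streams add up. The per-deployment rates below are the ones quoted above; the fleet sizes are entirely hypothetical, picked only to illustrate the scale.

```python
MB_PER_GB = 1024
GB_PER_TB = 1024

sources = {                         # (Mb/day per deployment, assumed fleet size)
    "home energy management": (100, 1_000_000),
    "Brazil-Tag e-plates":    (8.6 * MB_PER_GB, 100_000),   # ~8.6 Gb/day each
    "air quality monitoring": (112, 10_000),
}

total_mb_per_day = sum(rate * count for rate, count in sources.values())
total_tb_per_day = total_mb_per_day / MB_PER_GB / GB_PER_TB
print(f"aggregate: ~{total_tb_per_day:,.0f} Tb/day")        # ~936 Tb/day
```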
The challenges of analytics, intelligence, prognostics, etc., arising from sensor networks were discussed in a SensorCon panel that I took part in yesterday.
The data already available, and the rate at which new data is flowing in, are remarkably high, as Mr. Guru mentioned here. I feel that processing speed and storage technology are gearing up simultaneously and will keep meeting the demand. Just compare computer technology from 1992 to 2012: from the Intel Pentium series to the present multi-core processors, HDDs, and memory, there has been about a 50-times improvement over twenty years. Optical interconnects between the processor and the peripheral components give great hope that within 8 years we can see the same 50-times growth that happened over the past 20 years.
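For a sense of what those growth rates imply, here is a quick conversion of the commenter's 50x figures into equivalent annual growth rates (the 50x numbers are taken from the comment itself; only the arithmetic is added).

```python
def annual_rate(factor, years):
    """Compound annual growth rate implied by an overall improvement factor."""
    return factor ** (1 / years) - 1

print(f"50x over 20 years = {annual_rate(50, 20):.1%}/year")   # ~21.6%/year
print(f"50x over 8 years  = {annual_rate(50, 8):.1%}/year")    # ~63.1%/year
```

In other words, repeating 20 years of progress in 8 would require roughly tripling the historical annual improvement rate.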