PORTLAND, Ore.—Hewlett Packard Co. is actively pursuing "sensing as a service" in future applications of its micro-electro-mechanical system (MEMS) expertise, according to an HP business strategist.
Delivering the keynote address at this week's MEMS Executive Congress in Scottsdale, Ariz., Rich Duncombe said MEMS is a unique technology that has been disruptive for 25 years, revolutionizing multiple industries with no end in sight. HP has been developing MEMS since 1985, mostly for its ink-jet printers, but it has recently been developing new applications of its MEMS expertise for seismic imaging and infrastructure monitoring.
In HP's core MEMS market—ink-jet printer heads—Duncombe claims that Moore's Law is not slowing down.
"Our MEMS technology for ink-jet printers has been doubling the number of drops per second every 18 months since 1985," said Duncombe. "What other disruptive technology continues to be disruptive after 25 years?"
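Duncombe's claim implies a startling cumulative improvement. As a back-of-the-envelope check (the 18-month doubling period and 25-year span come from the article; the function itself is purely illustrative):

```python
# Rough check of the "TIJ Moore's Law" claim: drops per second
# doubling every 18 months, sustained for 25 years.
def growth_factor(years, doubling_period_years=1.5):
    """Cumulative improvement factor from periodic doubling."""
    doublings = years / doubling_period_years
    return 2 ** doublings

factor = growth_factor(25)
print(f"{25 / 1.5:.1f} doublings -> roughly {factor:,.0f}x more drops per second")
```

Twenty-five years at an 18-month doubling period works out to about 16.7 doublings, or a droplet-rate improvement on the order of 100,000x since 1985.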
But now HP is branching out into other MEMS technologies, all based on the same fabrication lines used to make its ink-jet nozzles. In particular, HP is supplying Shell Oil with ultra-precise MEMS accelerometers that Shell is using to improve its seismic imaging efforts, deploying as many as one million wireless sensor nodes per image. Shell has extensive data-processing expertise, which it was previously using with full-sized seismic sensors to search for oil, and adapting those processing capabilities for use with high-resolution MEMS sensors has been straightforward.
MEMS nozzles for thermal ink-jet (TIJ) print heads at HP have doubled their droplet rate every 18 months for 25 years, a trend HP calls the TIJ Moore's Law.
However, similar infrastructure sensors, used for applications such as monitoring bridges, buildings and other assets that can deteriorate over time or be damaged by earthquakes, will usually be deployed by personnel without the expertise or data centers necessary to interpret the raw sensor data. As a consequence, HP now believes that "sensing as a service" makes sense for infrastructure monitoring. Thus, instead of just selling the seismic sensors, it is seeking to sell a service that includes both installing the sensors and providing 24/7 monitoring.
"There are so many different ways to gather information about the environment, then take that data and make informed decisions with it," said Duncombe. "MEMS is front and center in a market that is enormous."
Duncombe said that "sensing as a service" is already a $150 billion market today, and predicted it will grow by leaps and bounds to as much as $350 billion by 2013.
HP's Duncombe told me privately that there will be a learning curve to getting started in "sensing as a service" for all the reasons you state, plus a few others involving legal responsibility. The one thing that bothers me is that it will be hard to tune these systems against false positives. For less critical applications, you have many examples from which to learn, but how many times does a bridge collapse?
Of course there would be a large consulting component to each "sensing as a service" project, the kind of consulting that has been going on for many years in the design of instrumentation and telemetry systems. The third-party monitoring and interpretation of the data is the new aspect of their strategy. If they are going to coin a new phrase like "Sensing as a Service," however, they should have named it something else, so that the acronym would not be confused with Software as a Service. What about "Instrumentation as a Service"?
I have seen this kind of comment a thousand times, and for a long time now. MEMS sounds like high tech, but it offers very few jobs in industry. I obtained my Ph.D. in this field, yet I cannot even secure a position as a process engineer in any foundry. Actually, I'm doing some programming to earn a living. In my company, most of the application engineers have backgrounds in physics rather than MEMS. In my opinion, MEMS is a big lie for students.
"Sensing as a service" from HP will be bought by many due to its reputation in the market, built on quality products. This service requires highly talented and experienced manpower to analyze the data, since the data received from the sensors will be random in nature due to the various factors influencing them. To be successful in this venture, independent monitoring and study will be required for every installation. Expensive!!
Agree. 4G should solve many bandwidth problems, but the key is reducing the cost of sensors and infrastructure management to the level necessary to achieve massive economies of scale. Here, killer app(s) must be found. Safety-critical apps ain't it, in my opinion. Consumer apps will be.
The concept seems great, but I wonder how soon it can really become a profitable business. I would agree with the postings pointing out that there are many obstacles to the remote sensing and transmission of data. I can't help but think that 4G networking could solve many of those data communication problems. The question is cost and reliability. At some point, both the cost of using a low-bandwidth 4G connection and the meshing of multiple redundant sensors should overcome the current limitations. Who knows what great strides await then? I could see automated preemptive warning emails or phone calls, sent when supporting structures begin to fail, preventing a bridge collapse. Really neat idea. My sense is that it will come sooner than we think.
MEMS is just getting started, and HP is not the only player in this technology. Many firms are actively involved. More than 60% of future global sensing needs will be met by MEMS. I don't even think HP can play well in this technology as a leader. I see some startups with new ideas that merge MEMS with nanotechnology.
DrQuine: you bring out good points about the needed infrastructure to sustain all that data from sensors and interpret it properly. Not sure where the numbers for the potential market come from; they seem exceedingly optimistic to me.
"Sensing as a service" sounds like a great idea - but the sensor seems to be the tip of the iceberg. A lot more infrastructure is required for power, data processing, and connection to the web. Will multiple sensors on a bridge link together to a common wireless transmitter or will a mesh of low power transmitters relay the data to the cloud? Processes will also need to be developed to address the inevitable connectivity failures - do they represent a bridge collapse or salt in the power supply?