The proliferation of microprocessors has sparked an explosion of net-centric and connected smart devices, or embedded systems, in homes and offices. From systems for control of heating, ventilation, access and security equipment to electrical metering devices, process control systems, on-site backup generators, smart appliances and automated food-processing equipment, today's facilities contain a panoply of embedded systems.
While such devices and systems provide great utility in their specific areas of application, they pose a challenge in today's increasingly knowledge-based economy: They do not always communicate with each other, with enterprise systems or with the Internet.
Embedded systems have been developed largely without the benefit of accepted standards. They employ a dizzying array of communications protocols, data formats and software platforms. The result is that the world of embedded systems is highly fragmented, not interoperable and more complex and expensive than necessary. Embedded systems simply have not been conceived with the Internet, interoperability or integration with the enterprise in mind.
While it is true that more recent systems follow some of the emerging standards, that has only exacerbated the problem because the standards themselves have not been designed to interoperate. Indeed, the push toward standard protocols has only created more "languages" that need to be integrated. And because devices continue to get smaller, less expensive and more focused on a single application, we cannot expect to see the devices themselves solve the problem by speaking many languages simultaneously.
Multitude of standards
History has demonstrated that interoperability will not be accomplished through the use of any single "standard"; too many such standards have already built viable followings. The ability to move to new standards is also problematic for the owner. Economics do not support wholesale replacement of existing, functioning systems. It is simply not practical to replace all existing devices with ones that speak a new standard, no matter how compelling the potential benefits.
What we are left with is the need to integrate the myriad smart devices without affecting the devices themselves: a solution that embraces the multitude of new standards and the wealth of legacy systems equally.
The desktop computer revolution created a standardizing force that allowed software developers and manufacturers to focus on a single platform and set of technologies. It provided a standard methodology for interfacing to printers, networks, storage systems and so on. The standard platform had the effect of isolating (or freeing) application developers from the details of how the devices worked, thereby allowing them to focus on their applications. The result was dramatic acceleration of application development in the software market.
This fundamental step has not occurred in the world of embedded systems and smart devices. There is no standard platform on which application developers can build. Every software application has to be written to deal with the vagaries of any system with which it will be used. That puts a tremendous financial burden on developers and has limited the range of independent software applications that can be used with embedded systems.
What is needed is a solution that brings together all embedded devices, old, new, standard or otherwise, into a single environment that shields the user (and software developer) from the distinctions among systems. That is the role of an automation infrastructure.
The infrastructure is a layer of software that resides above the individual systems and their specific protocols. Its role is to provide a uniform method of accessing data from the various smart devices and issuing commands to those devices. The infrastructure must be protocol-agnostic and vendor-neutral.
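The protocol-agnostic access layer described above can be sketched in a few lines of Python. The idea is that every protocol-specific driver hides behind one uniform read/write interface, so nothing above that layer needs to know which "language" a device speaks. The class names and the in-memory register map below are illustrative assumptions, not any vendor's API; a real driver would speak an actual wire protocol such as Modbus/TCP.

```python
from abc import ABC, abstractmethod

class DeviceAdapter(ABC):
    """Uniform surface for reading data from and issuing commands to a device."""

    @abstractmethod
    def read(self, point: str) -> float:
        """Return the current value of a named data point."""

    @abstractmethod
    def write(self, point: str, value: float) -> None:
        """Issue a command to the device."""

class ModbusAdapter(DeviceAdapter):
    """Illustrative stand-in for a Modbus driver, backed by a dict of registers."""

    def __init__(self, registers):
        self.registers = dict(registers)

    def read(self, point: str) -> float:
        return self.registers[point]

    def write(self, point: str, value: float) -> None:
        self.registers[point] = value
```

Applications hold only `DeviceAdapter` references; swapping a BACnet device for a Modbus one then requires no change above this layer.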
The infrastructure enables seamless integration of the diverse systems by introducing a new factor into the equation: a common object model. Simply put, the infrastructure takes the data elements from all of the various devices (inputs, outputs, set points, schedules and control parameters) and morphs those items into virtual objects. That creates a virtual model of the actual systems, a model that supports all of the functions and features of the end devices. The result is a uniform, normalized database of objects with which the user or application developer can work.
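The normalization step can be made concrete with a minimal sketch: vendor-specific records, each with its own field names, are mapped into one common object type. The record layouts and tag names below are invented for illustration; real protocols expose data very differently.

```python
from dataclasses import dataclass

@dataclass
class VirtualPoint:
    """One entry in the normalized object database."""
    name: str
    kind: str    # "input", "output", "setpoint", "schedule", ...
    units: str
    value: float

def normalize(raw: dict) -> VirtualPoint:
    """Map one vendor-specific record into the common object model.
    The field names ("tag", "type", "units", "val") are assumptions."""
    return VirtualPoint(
        name=raw["tag"],
        kind=raw.get("type", "input"),
        units=raw.get("units", ""),
        value=float(raw["val"]),
    )

# Two devices speaking different "languages" land in one uniform database.
bacnet_rec = {"tag": "AHU1.SupplyTemp", "type": "input",
              "units": "degC", "val": "18.5"}
modbus_rec = {"tag": "Meter3.kW", "val": "42.0"}
db = [normalize(r) for r in (bacnet_rec, modbus_rec)]
```

Once every record has passed through a mapping like this, displays, reports and control logic can query one schema instead of many.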
The infrastructure takes in heterogeneous data from the different systems and creates standardized software objects that represent them. The virtual objects are fully interoperable. On top of that object database, the infrastructure provides a set of general services, such as a real-time control engine, scheduling, alarm functions and Internet connectivity.
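A general service such as alarming becomes straightforward once all points share one shape, because a single limit check can cover every underlying system. The point names and limit values below are illustrative assumptions.

```python
# Each normalized point is (name, value, units); limits map name -> (low, high).
def check_alarms(points, limits):
    """Flag any point whose value falls outside its configured limits."""
    alarms = []
    for name, value, units in points:
        lo, hi = limits.get(name, (float("-inf"), float("inf")))
        if not lo <= value <= hi:
            alarms.append(f"{name} = {value} {units} outside [{lo}, {hi}]")
    return alarms

points = [("AHU1.SupplyTemp", 35.0, "degC"),   # from one vendor's system
          ("Meter3.Demand", 42.0, "kW")]       # from another
limits = {"AHU1.SupplyTemp": (10.0, 30.0)}
alarms = check_alarms(points, limits)
```

The alarm engine never touches a protocol; it works purely on the normalized object database.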
The common object model, which has access to all of the data and actions supported by the diverse systems, can now serve as a foundation for other software applications that provide value to the operation of the facility. Examples include real-time energy data collection and analysis, as well as execution of global control strategies, such as schedules and alarming.
With those common objects, one can build browser-based displays, reports, alarms and even supervisory control logic that will work across the multiple systems. The result is true interoperability without the need for users to get mired in the details of competing protocols and without the need to disturb the installed systems.
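Supervisory control across multiple installed systems can be sketched the same way: because every system exposes the same write surface, one global strategy can command devices from different vendors without knowing their protocols. The adapter, point and value names here are illustrative, and the in-memory `Adapter` is a toy stand-in for real drivers.

```python
class Adapter:
    """Toy stand-in for a protocol-specific driver with a uniform write surface."""
    def __init__(self, points):
        self.points = dict(points)
    def write(self, point, value):
        self.points[point] = value

hvac = Adapter({"Zone1.Setpoint": 21.0})      # e.g. a BACnet HVAC system
lighting = Adapter({"Floor1.Level": 0.0})     # e.g. a proprietary lighting panel

def apply_occupied_mode(commands):
    """One global strategy driving devices on different underlying systems."""
    for system, point, value in commands:
        system.write(point, value)

apply_occupied_mode([(hvac, "Zone1.Setpoint", 22.5),
                     (lighting, "Floor1.Level", 80.0)])
```

The supervisory logic is written once against the common model, yet takes effect across every installed system, which is the interoperability the article describes.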