So it's open this and open that, and maybe the door is open a wee bit too much? What do you think?
I fully agree that open standards are the dream of implementers, and that they help both manufacturer and user lower costs.
But I disagree that an open standard means opening up vulnerabilities. The protocols should be open and standardized, but the network has to be closed and private.
No standard suggests connecting critical systems directly (unprotected) to the internet. Everyone knows this is an open door and a stupid idea, but it happens way too often. In my opinion it's mainly a problem of knowledge and training.
Hackers are able to attack fieldbus systems too, but there is very little interest in doing so because factories don't have fieldbus plugs on the outside of their walls. We have to implement the networks and gateways in a way that keeps it like that.
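To make that "closed and private network" idea concrete, here is a minimal sketch in Python; the gateway model, the address range, and the allow-list are hypothetical, not taken from any particular product or standard. The point is simply that the gateway refuses to forward anything that does not originate on the private control LAN:

```python
import ipaddress

# Hypothetical private control LAN: only hosts on this network
# may reach the fieldbus gateway at all.
CONTROL_LAN = ipaddress.ip_network("192.168.10.0/24")

def gateway_accepts(source_ip: str) -> bool:
    """Return True only if the request comes from the closed control network."""
    return ipaddress.ip_address(source_ip) in CONTROL_LAN
```

With a rule like this at every gateway, an open, well-documented protocol can run on the inside while the "fieldbus plug" stays off the outside wall.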
If I turn the argument inside-out, the flaw in the reasoning becomes apparent: if I'm designing a new kind of OS, will hiding the documentation of how it works guarantee (or even enhance) its security? Of course not. The CSS encryption scheme used for DVD authoring was "broken" by Jon Lech Johansen of Norway while he was still a teenager, and that was with absolutely NO public documentation of how it worked. In fact, "open source" encryption software is actually PREFERRED for its robustness, since many different people get the opportunity to review the implementation, try to "poke holes" in it, and flag possible exploits that need to be protected against.

The security strategy properly belongs mostly in the communications regimen, with only the usual precautions against buffer overruns and similar "hygiene" observed in the operating code. With the "traditional" internet we kind of "backed into" better security by putting NAT into our routers, which was really mostly a workaround for the address shortcomings of IPv4. It's conceivable we could come up with an IoT implementation without this explicit mechanism (to avoid router power and cost issues, obviously), but I'd suggest we proceed cautiously if we do, because alternative approaches that rely more on "tunnelling"-type encryption protocols aren't without cost themselves, and we certainly don't want an IoT implementation with security issues "baked in" from the get-go.
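As a small, hypothetical illustration of the kind of flaw that many-eyes review tends to catch: a naive byte-by-byte comparison of a secret returns early at the first mismatch, so its running time leaks how much of an attacker's guess was correct. Reviewed open-source libraries expose constant-time alternatives for exactly this reason; Python's standard library offers `hmac.compare_digest`:

```python
import hmac

def naive_equal(a: bytes, b: bytes) -> bool:
    # Returns at the first mismatching byte, so the running time
    # reveals how many leading bytes of the guess were correct.
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def safe_equal(a: bytes, b: bytes) -> bool:
    # Constant-time comparison from the standard library; its
    # running time does not depend on where the inputs differ.
    return hmac.compare_digest(a, b)
```

Both functions give the same answers; the difference is purely in the timing side channel, which is precisely the sort of subtlety that gets spotted when many reviewers can read the code.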
This article does not clearly separate the ideas of open standards and open systems.
Open standards, as opposed to proprietary, are vetted by a community and have the possibility of higher security because you have a lot of individuals looking over the shoulder of the implementer.
Proprietary standards can be secure, but their closed development subjects them to internal pressures that are less prevalent with open standards: development schedules, cost, and reduced oversight.
Proprietary standards' only advantage is their obscurity, and we have seen too many examples of the failure of security by obscurity to rely on it. This pendulum will not swing back. Open standards have supplanted obscurity in both their advantages and their level of security.
Management decisions from afar can unknowingly have significant impact on the compromises chosen during development of proprietary standards. Couple this with less oversight and you have significant pressure to make sloppy implementations.
On the other hand, all open systems, those accessible to outside actors, are subject to nefarious activity. It matters not whether the standards employed are proprietary or open. It does matter how well the standards are implemented and we have already argued that open standards have significant advantages in this respect.
Understand the differences. "Open" can refer to multiple aspects of our machines and not separating these clearly simply muddies the discourse.
While open standards have the advantage of more people testing them, they face some of the same economic and other constraints as proprietary ones. Most standards organizations are populated by people representing their employers' interests, meaning the standards sometimes fall far short of the ideal, and developing organizations sometimes take shortcuts in both development and testing that compromise the intended level of security. No particular approach is a panacea.
Your example is borne out in the real world by NFPA/NEC, CGA, ANSI, ASTM, and others, where manufacturers drive the standards. These standards are owned by profit-making organizations and are created by actors who profit when the standards specify their products.
Take the NEC's specification of GFCI and, more recently, AFCI devices, despite the lack of a cost-benefit case for their wide adoption. While these are useful devices, their initially poor implementation and high cost make for a less reliable, more costly product for the consumer and a renewable income stream for the NEMA member companies. Hardly a good example of an open process.
Take, on the other hand, the open source software development community where the entire product is readily available at no cost to all. The most interesting example is the development of public domain encryption software. No single actor could slip a back door into the application without it being revealed to all.
Contrast this with the proprietary encryption implementations of the companies selling encryption software where it was recently revealed that the NSA had directed the inclusion of back doors or inadequate implementations allowing NSA access to the "protected" data. The manufacturers were barred from disclosing the intentionally flawed implementations they were selling. Even better examples come from years past when proprietary encryption implementations were simple enough that only fourth grade math skills were required to decipher the encrypted data.
Your example is just another variation on a proprietary standard, albeit one employed broadly by industries. Broad adoption as a standard by multiple organizations does not make a standard "open." It is just a different variation on private standards.
Even if the software and the protocols are free of bugs and security flaws, opening up the physical communication path to the outside world makes denial of service attacks possible. In a good many applications, mere delay of status reporting and control action can result in anything from lost productivity to a public disaster. Routing the monitoring and control of an essential system through the Internet is plain stupid. A dedicated path, or at least a dedicated synchronous time slot, is necessary to guarantee prompt delivery of essential communication.
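A sketch of that "dedicated synchronous time slot" idea in Python; the slot width, slot count, and device assignments are invented for illustration. Each device owns one slot per repeating cycle and transmits only during it, so essential traffic can never be crowded out by other senders:

```python
SLOT_MS = 10    # hypothetical width of each time slot, in milliseconds
NUM_SLOTS = 8   # hypothetical number of devices sharing the cycle

def current_slot(now_ms: int) -> int:
    """Which slot of the repeating cycle is active at time now_ms."""
    return (now_ms // SLOT_MS) % NUM_SLOTS

def may_transmit(device_slot: int, now_ms: int) -> bool:
    """A device may send only during its own assigned slot."""
    return current_slot(now_ms) == device_slot
```

Because every device gets a guaranteed turn each cycle, worst-case delivery latency is bounded by the cycle length, which is exactly the property a best-effort Internet path cannot promise.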