The facts are that Open Source software is generally of higher quality than proprietary software. In an article published on 4/16/14, Coverity reported that the defect density for Open Source code was 0.59 defects per thousand lines of code, while proprietary code came in at 0.72.
There was lots of other useful information in the article, such as the fact that 17 times as many defects were fixed in proprietary code as in the Open Source code.
So - stop with the soul searching and just get on with it.
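To put those densities in perspective (my own back-of-the-envelope arithmetic, not from the Coverity report): density is defects per thousand lines of code, so for a hypothetical million-line codebase the expected defect counts work out as:

```python
# Coverity defect density is measured in defects per 1,000 lines of code.
OPEN_SOURCE_DENSITY = 0.59
PROPRIETARY_DENSITY = 0.72

lines_of_code = 1_000_000  # hypothetical million-line project

open_defects = OPEN_SOURCE_DENSITY * lines_of_code / 1000
prop_defects = PROPRIETARY_DENSITY * lines_of_code / 1000
print(open_defects)  # 590.0 expected defects
print(prop_defects)  # 720.0 expected defects
```

Small per-KLOC differences add up to real numbers at scale.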
hasenmi hits the nail on the head. If this occurred in proprietary code, we'd never know about it.
The true value of free is not in the dollars spent, but in the openness. When there is a problem it is there for everyone, and being open there are many more eyes looking for the problem. The openness ensures there is rapid pressure to fix the problem.
By comparison, for a proprietary tool such as the Microsoft Crypto API, we have to rely on Microsoft assuring us it does not have such a bug. And just Microsoft engineers looking for bugs. And Microsoft's professionalism in fixing and publicizing any bugs they do find.
Many free and open source projects are very well resourced - think of Linux, Apache and the GNU Compiler Collection. All developed by very large communities of well paid full-time professionals. There is plenty of commercial incentive to resource these projects properly.
But even smaller projects actually look quite good compared to much proprietary software. For example, a lot of proprietary EDA tools are provided by small companies or small departments of larger companies, where the R&D team numbers fewer than 10 and is doubling up to provide second-line support. By comparison, even small-team free software looks pretty good, and you still have the freedom to fix it yourself - no dependency on the supplier here.
The article is correct that we have to look at the cost of free software. But when we look at all the costs, it often turns out that freedom can deliver much better value than proprietary software.
Open Source Software is quite expensive to produce. Just integrate the man-hours that go into something like the Linux kernel or OpenSSL and then multiply by the loaded overhead of an average programmer. You will be shocked at the amount of time-is-money being contributed to OSS efforts.
Yet each contribution is frequently a point patch. They typically solve a problem the patch submitter has. While it is the "job" of the core committers to rationalize all of the patches and move the technology forward in a coherent fashion, there is both little reward for doing so and sometimes deep push back from the user community.
Take, for example, the Python language transition between v2 and v3. There is a deep architectural flaw in how Python v2 misuses strings as byte arrays. We learned this lesson from the C language: null-terminated strings are a bad idea, and misusing pointers is also the source of many security flaws. Guido and the core committers have tried to fix this flaw in version 3. There has been a remarkable amount of pushback from the community. They think their code isn't broken. (It is, but they don't run into the problems with international text that Python v3 fixes.) Python v3 is a better language by almost every measure, yet after 5 years it only has about 10% uptake. The v2 stubborn mules are making a lot of noise about how Python v3 has failed. I don't believe it has failed. It has started the transition of a 20+ year old technology onto a new path. That it has taken 5 years is a testament to the quality and maturity of the engineering team and their respect for the value of legacy code written in Python.
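The str/bytes confusion described above is easy to demonstrate; here is a minimal Python 3 sketch of my own (an illustration, not code from either project):

```python
# In Python 2, a plain string was really a byte array, so slicing or
# measuring non-ASCII text gave byte-oriented (often wrong) answers.
# Python 3 separates text (str) from raw bytes (bytes) explicitly.

text = "héllo"               # str: a sequence of Unicode code points
data = text.encode("utf-8")  # bytes: the encoded on-the-wire form

print(len(text))  # 5 code points
print(len(data))  # 6 bytes, because "é" encodes to two bytes in UTF-8

# Mixing the two is now a TypeError instead of silent mojibake:
try:
    text + data
except TypeError as err:
    print("cannot mix str and bytes:", err)
```

Making that mixing an immediate error is exactly the kind of breaking change that drew the community pushback.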
By almost any measure of engineering quality, the Python transition is the model of how architectural flaws should be addressed in open source. It is a testament to the leadership of Guido's team that they persevere. From the independent analyses I've found, it doesn't appear that the OpenSSL team has a similar ability to improve their code base. Yet OpenSSL, as the foundation of e-commerce, is a much more economically important technology than Python.
But in the game of software development, which more frequently resembles a game of cut-throat, he who invests the least and gets to market the quickest wins. There is little time to justify reengineering "free" software. Just "bolt it on and hope she holds together" is the operating maxim. Well, OpenSSL isn't holding together.
OpenSSL could do with a little bit of corporate stewardship. Google found the problem. (I suspect they found it due to a likely internal audit of every security protocol resulting from NSA having pwned them and everyone else.) Google obviously depends upon OpenSSL for their business. How many Google engineers are in the OpenSSL core committing team? In contrast, how many are in the Linux core committing team?
I hope more companies start funding the development of these critical technologies more directly. Their businesses depend upon it.
People and companies will pick what is best by their own criteria, whether that's software or washing powder.
Cost isn't always going to be the deciding factor; look at the web server share going to Apache, or the use of free languages like Scala in big corporates like Twitter, various banks, etc.
I don't see myself adding a bit of code for an OS like Windows. Why should I, why would MS let me? At the high end, this isn't about cheap, it's about leveraging the available talent pool at many different companies, institutions, and other individual developers where all can contribute for the greater good.
As one datapoint: in our latest project - a cloud-based commercial offering - there is no closed or paid-for code. That's because the free options beat everything else hollow: technically, in published literature, in user base, etc.
On the other hand, if I play a computer game, it'll be one produced commercially, because those are currently better.
In the end we'll find out what the market thinks on a case by case basis.
I have often wondered about the risks of open source software with regard to security for the Internet. In particular, the Internet of Things.
I have scheduled a live chat on this topic for Wednesday the 23rd at 8am Pacific time. Like EETimes, you have to be registered on the IoT World site to post, but you can come "listen in" without registration as well.
As a special guest, Dave Hughes of HCC, a security middleware provider, will join in to talk about the role of commercial software in security for the IoT.
If this problem had happened in a closed source library, it would have taken a report to the vendor, and time for them to prioritize the problem, debug it, run it through channels, figure out how to spin the problem to not be their fault or to minimize it, and then schedule a release to contain the fix.
Since the problem was open source, the code was looked at, and someone other than the vendor was able to find the problem, and the fix was made available. No spin, no wait for the release, just the fix.
I like that the author is asking questions. I think we need to remember not to generalize Open or Closed Source too much, and being bigoted won't help us.
Lots of Open Source doesn't mature to anything useful. Have you perused SourceForge or the like for something new and useful? It's often an exercise in frustration: searching through countless dead, half-baked projects. Quality of code varies in successful and unsuccessful projects alike.
There are an amazing number of very successful Open Source projects. I include OpenSSL in this pool. It's amazing that we haven't experienced more flaws in OpenSSL in the _many_ years we've all been using it. Compare its record to any Open or Closed technology and it will fare well.
I've used OpenSSL to build many commercial products as a Software Developer, and the quality of the OpenSSL code was arguably better than much of the surrounding proprietary code.
I don't believe in perfect code. High quality code does exist. Maybe this incident will reinvigorate us to invest the time and money to make OpenSSL even better than it is today? There aren't many projects of equal importance in the Open or Closed Source ecosystem. It's a fundamental building block of our Internet.
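As a practical aside, since so many stacks link against OpenSSL, it is worth knowing which version your environment is actually running when a flaw like this is announced. A minimal sketch using Python's standard `ssl` module, which wraps the OpenSSL the interpreter was built against:

```python
import ssl

# These constants report the underlying OpenSSL library, which is what
# you need to check against a security advisory's affected-version list.
print(ssl.OPENSSL_VERSION)       # e.g. a string like "OpenSSL 1.0.1g ..."
print(ssl.OPENSSL_VERSION_INFO)  # version tuple for programmatic checks
```

Equivalent checks exist in most languages; the point is that patching starts with knowing what you link against.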
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Specifically the guests will discuss sensors, security, and lessons from IoT deployments.