People and companies will pick what is best by their own criteria, whether that's software or washing powder.
Cost isn't always going to be the deciding factor: look at the web server share going to Apache, or the use of free languages like Scala at big corporates such as Twitter and various banks.
I don't see myself contributing a bit of code to an OS like Windows. Why should I, and why would MS let me? At the high end, this isn't about cheap; it's about leveraging the available talent pool across many different companies, institutions, and individual developers, where all can contribute for the greater good.
As one data point: our latest project, a cloud-based commercial offering, contains no closed or paid-for code. That's because the free software beats everything else hollow - technically, in published literature, in user base, and so on.
On the other hand, if I play a computer game, it'll be commercially produced, because commercial games are currently better.
In the end we'll find out what the market thinks on a case by case basis.
Open source software is quite expensive to produce. Just integrate the person-hours that go into something like the Linux kernel or OpenSSL and multiply by the loaded overhead of an average programmer. You will be shocked at the amount of time-is-money being contributed to OSS efforts.
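As a sketch of that back-of-envelope calculation - both figures below are purely illustrative assumptions, not measurements of any real project:

```python
# Purely illustrative numbers: both values below are assumptions.
person_years = 5000             # hypothetical cumulative effort on a kernel-scale project
loaded_cost_per_year = 150_000  # hypothetical fully loaded cost per developer-year (USD)

total_value = person_years * loaded_cost_per_year
print(f"Contributed time-is-money: ${total_value:,}")  # $750,000,000
```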
Yet each contribution is frequently a point patch. They typically solve a problem the patch submitter has. While it is the "job" of the core committers to rationalize all of the patches and move the technology forward in a coherent fashion, there is both little reward for doing so and sometimes deep push back from the user community.
Take, for example, the Python language transition from v2 to v3. There is a deep architectural flaw in how Python v2 misuses strings as byte arrays. We learned this lesson from the C language: null-terminated strings are a bad idea, and misusing pointers is the source of many security flaws. Guido and the core committers have tried to fix this flaw in version 3, and there has been a remarkable amount of pushback from the community. They think their code isn't broken. (It is, but they don't run into the problems with international text that Python v3 fixes.) Python v3 is a better language by almost every measure, yet after 5 years it has only about 10% uptake. The v2 stubborn mules are making a lot of noise about how Python v3 has failed. I don't believe it has failed. It has started the transition of a 20+ year old technology onto a new path. That it has taken 5 years is a testament to the quality and maturity of the engineering team and their respect for the value of legacy code written in Python.
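A minimal sketch of the string change in question (the literals here are just illustrative): in Python v3, text (`str`) and binary data (`bytes`) are distinct types and conversions must be explicit, whereas v2 let byte strings masquerade as text until non-ASCII data broke things at runtime.

```python
# Python 3 separates Unicode text (str) from raw bytes (bytes).
text = "café"                # str: a sequence of Unicode characters
data = text.encode("utf-8")  # explicit str -> bytes conversion

print(len(text))   # 4 characters
print(len(data))   # 5 bytes: 'é' encodes to two bytes in UTF-8

# Python 3 refuses to mix the two types implicitly,
# where Python 2 would silently coerce and fail later.
try:
    "abc" + b"def"
except TypeError:
    print("mixing str and bytes raises TypeError")
```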
By almost any measure of engineering quality, the Python transition is the model of how architectural flaws should be addressed in open source. It is a testament to the leadership of Guido's team that they persevere. From the independent analyses I've found, it doesn't appear that the OpenSSL project has a similar ability to improve its code base. Yet OpenSSL, as a foundation of e-commerce, is a much more economically important technology than Python.
But in the game of software development, which more frequently resembles a game of cut-throat, he who invests the least and gets to market the quickest wins. There is little time to justify reengineering "free" software. Just "bolt it on and hope she holds together" is the operating maxim. Well, OpenSSL isn't holding together.
OpenSSL could do with a little bit of corporate stewardship. Google found the problem. (I suspect they found it due to a likely internal audit of every security protocol resulting from NSA having pwned them and everyone else.) Google obviously depends upon OpenSSL for their business. How many Google engineers are in the OpenSSL core committing team? In contrast, how many are in the Linux core committing team?
I hope more companies start funding the development of these critical technologies more directly. Their businesses depend upon it.
hasenmi hits the nail on the head. If this occurred in proprietary code, we'd never know about it.
The true value of free is not in the dollars spent but in the openness. When there is a problem, it is visible to everyone, and being open means many more eyes are looking for it. That openness creates rapid pressure to fix the problem.
By comparison, for a proprietary tool such as the Microsoft Crypto API, we have to rely on Microsoft assuring us it does not have such a bug. And just Microsoft engineers looking for bugs. And Microsoft's professionalism in fixing and publicizing any bugs they do find.
Many free and open source projects are very well resourced - think of Linux, Apache and the GNU Compiler Collection, all developed by very large communities of well-paid full-time professionals. There is plenty of commercial incentive to resource these projects properly.
But even smaller projects look quite good compared to much proprietary software. For example, a lot of proprietary EDA tools are provided by small companies or small departments of larger companies, where the R&D team numbers fewer than 10 and is doubling up providing second-line support. By comparison, even small-team free software looks pretty good, and you still have the freedom to fix it yourself - no dependency on the supplier here.
The article is correct that we have to look at the cost of free software. But when we look at all the costs, it often turns out that freedom can deliver much better value than proprietary software.
The fact is that open source software is generally of higher quality than proprietary software. In an article published on 4/16/14, Coverity reported that the defect density for open source code was 0.59, while for proprietary code it was 0.72.
The article contained lots of other useful information, such as the fact that 17 times as many defects were fixed in proprietary code as in open source code.
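To put those densities in perspective - assuming they are expressed, as Coverity's scan reports are, in defects per 1,000 lines of code, and taking a hypothetical million-line codebase:

```python
# Densities from the cited report; the codebase size is hypothetical.
open_source_density = 0.59   # defects per 1,000 lines, open source
proprietary_density = 0.72   # defects per 1,000 lines, proprietary

kloc = 1_000  # a hypothetical 1,000,000-line codebase

print(f"open source: ~{open_source_density * kloc:.0f} expected defects")
print(f"proprietary: ~{proprietary_density * kloc:.0f} expected defects")
```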
So - stop with the soul searching and just get on with it.