There's an interesting article in the July 2010 edition of Wired magazine (yes, it's a print publication; I subscribe only because the FAA won't let us read electronically during takeoff and landing--one of the best times to read). The article is about Sergey Brin, the co-founder of Google. Brin estimates that he has a 50/50 chance of contracting Parkinson's, and he is helping fund research to find a cure--hopefully, before he develops the debilitating disease.
The entire article is intriguing--what Parkinson's is, its possible genetic links, and the personal story of an intelligent, highly successful man. Yet what fascinated me most was an underlying theme: that the standard scientific method as we've known it could become extinct.
You remember the scientific method: propose a hypothesis, design tests, analyze results, repeat until convinced. This standard method for gaining knowledge--for seeking the truth--has been used for a thousand years and has led to countless discoveries and breakthroughs.
The Internet, or I should probably say the Information Age, could bring about the demise of the standard scientific method. How? By virtue of the massive amount of data that continues to be produced--prior to any hypotheses. The new scientific method, as talked about in the Wired article, could look like this: scan the data, look for patterns, draw conclusions, find truth. No more would a scientist have a sudden thought and seek to prove it. Instead, "regular" people would contribute data that, when aggregated, would reveal the secrets of science.
With an estimated 2 billion users of the Internet today, the amount of data they can provide about a given problem is enormous. In the case of disease, for instance, people who report on their health and living conditions could provide data in which today's monster search engines could uncover commonalities: things that a limited set of tests, no matter how carefully thought out, could not. Disclosure would be voluntary--I don't want to get into privacy issues here--and I suspect people who become ill or have loved ones in danger would be more than willing to contribute information toward finding a cure or, even better, prevention.
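To make that idea concrete, here is a minimal sketch of what "scan the data, look for patterns" might mean in code. Everything in it is invented for illustration--the file name, the column names, and the assumption that self-reported factors are recorded as simple 0/1 flags--so treat it as a thought experiment, not a description of how any real study or search engine works.

```python
# A hypothetical, hypothesis-free pass over voluntarily reported health data.
# Assumed layout: one row per person, 0/1 columns for lifestyle and
# environmental factors, plus a 0/1 "diagnosed" outcome column.
import pandas as pd

reports = pd.read_csv("self_reported_health.csv")  # invented file name

factors = [col for col in reports.columns if col != "diagnosed"]

# Rank every reported factor by how strongly it correlates with the
# diagnosis across the whole population: patterns first, hypotheses later.
correlations = (
    reports[factors]
    .astype(float)
    .corrwith(reports["diagnosed"].astype(float))
    .sort_values(ascending=False)
)

print(correlations.head(10))  # candidate commonalities worth a closer look
```

Nothing in such a ranking proves anything, of course; it only surfaces patterns that someone would still have to investigate.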
The standards that I deal with every day are minuscule and fleeting compared to the scientific method. The thought that a standard as ingrained as the scientific method could be abandoned leaves me in awe.
Mr. Linares, that is far from reality. Please name a few examples of an authoritarian ruler who was able to establish genuine scientific progress; you may be confusing political rule and control of the masses with the search for truth.
Have you ever heard of Socrates, Plato, and the other great Greek thinkers? Or how about the Babylonian astronomers who began to comprehend the universe and build a predictive model of the planets? Both Western and Eastern cultures have progressed far beyond what you are quoting!
By my calculations there are at least 26 centuries of human development around the scientific method, just to remind you of OUR human achievements.
"scientific method...has been used for a thousand years"
Really? I would put it at a couple of centuries. For most of history, authority was the path to truth. And that is still the case in many parts of the world.
Please rest assured that the scientific method is safe and sound, regardless of the amount of data that is accumulated prior to forming a hypothesis. What you call a "sudden thought" is the definition of a guess -- well-measured theories are not guesses, but the product of trend and pattern analysis that (for the moment) can only be achieved by a living human brain.
While many theories may appear to start as guesses, they are almost certainly educated guesses. Some individuals are adept at subliminal analysis, and if these same people are also adept at describing and communicating their understanding, they advance the forefront of their field.
The true measure of a theory is its predictive power. Anyone can draw a trend line, but providing an underlying model that properly extrapolates to and predicts new situations is what establishes a theory as law. This will be true regardless of the amount of data in that original trend pool.
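One way to picture that point is a small sketch: fit a trend line to half of some data and see how it does on the half it never saw. The data here is made up (a noisy sine curve) and the straight-line fit is deliberately naive; the only point is that in-sample fit and out-of-sample prediction are different measures.

```python
# Contrast "drawing a trend line" with actually predicting unseen data.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = np.sin(x) + rng.normal(scale=0.1, size=x.size)  # invented data set

train, test = slice(0, 25), slice(25, 50)       # fit on the first half only
coeffs = np.polyfit(x[train], y[train], deg=1)  # the trend line
trend = np.poly1d(coeffs)

fit_error = np.mean((trend(x[train]) - y[train]) ** 2)       # data it saw
prediction_error = np.mean((trend(x[test]) - y[test]) ** 2)  # data it did not

print(f"error on the data used for the fit: {fit_error:.3f}")
print(f"error on new, unseen data:          {prediction_error:.3f}")
```

A model that captured the underlying mechanism would score far better on the second number, which is exactly what separates a theory from a trend line.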