Let’s suppose that the current exponential transistor unit volume growth
continues another forty years. Since the learning curve accurately
predicts the cost (and consequently the price) if you know the volume,
that means that the cost per transistor would decrease 99.999994 percent by
2052.[1] But maybe you think that 2052 is a bit too
ambitious a comparison. If we assume that Moore’s law will continue for
the next 10 years at least, then what is the result? Well, based on the
longstanding trend of the learning curve, we should expect to see a 58x
increase in transistor count for your inflation-adjusted dollar.
The result: Your MP3 player could store 1,237 two-hour movies (3.62E+08 by 2052),[2] 371,005 songs (1.08E+11 by 2052),[3] or 1,325,017 copies of Tolstoy's classic War and Peace (3.87E+11 by 2052).[4] The 2052 figure is equivalent to 2,607 digitized copies of the entire printed contents of the US Library of Congress.[5]
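The arithmetic behind these projections can be checked directly. The sketch below is a minimal calculation assuming a constant 33.4 percent annual decline in cost per transistor (the learning-curve rate cited in the notes) and the per-movie and per-song figures from the notes; small differences from the article's exact counts come from rounding the ten-year multiplier to 58x.

```python
# Learning-curve extrapolation: cost per transistor falls ~33.4% per year.
annual_decline = 0.334
remaining = 1.0 - annual_decline               # fraction of cost left after one year

# Transistors per inflation-adjusted dollar after n years is the
# reciprocal of the remaining cost fraction.
ten_year_multiplier = (1.0 / remaining) ** 10  # roughly 58x
forty_year_remaining = remaining ** 40         # tiny fraction of today's cost
forty_year_decline_pct = (1.0 - forty_year_remaining) * 100

print(f"10-year gain: {ten_year_multiplier:.1f}x")
print(f"40-year cost decline: {forty_year_decline_pct:.6f}%")

# Apply the 10-year multiplier to a 32GB iPod Touch (assumptions: 1.5GB
# per 2-hour SD movie, 6,400 songs per 32GB, both taken from the notes).
capacity_gb = 32 * ten_year_multiplier         # roughly 1,860GB
movies = capacity_gb / 1.5
songs = 6400 * ten_year_multiplier
print(f"~{capacity_gb:,.0f}GB -> ~{movies:,.0f} movies or ~{songs:,.0f} songs")
```

Running this reproduces the ten-year 58x multiplier and a forty-year decline in the 99.99999 percent range, in line with the article's figures.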
What technology could provide this capability? Current phase change memory
technology gives us almost two orders of magnitude in memory density.[6]
Combining this with stacked die multiplies it by up to another two
orders of magnitude. There are also other even more exotic but plausible
forms of storage on the horizon, for example the recently announced
advances in DNA storage, where information is encoded and stored in
strands of synthetic DNA. This medium would provide a 4,265,625,000x
(4.27E+09) improvement in storage capacity over the latest 20nm NAND
chip.[7] While this technology is in its infancy and
much too slow and expensive for commercial use, production costs of
generating DNA sequence data have declined by roughly 10x per year since
2001, according to the Human Genome Institute.[8] If this remarkable rate of improvement were to continue, DNA storage could be cost-competitive with NAND flash within ten years.[9] Not likely.
But very likely within the forty-year horizon.
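The headline DNA-versus-NAND ratio can be reproduced from the weight figures in the notes. This is a sketch assuming the quoted 455 billion GB per gram for DNA and a 64GB, 0.6-gram NAND chip:

```python
# Storage density comparison by weight (figures from note 7).
dna_gb_per_gram = 455e9                   # reported capacity of DNA storage
nand_capacity_gb = 64                     # Samsung Pro Class 1500 NAND chip
nand_weight_g = 0.6

nand_gb_per_gram = nand_capacity_gb / nand_weight_g   # ~106.7 GB per gram
improvement = dna_gb_per_gram / nand_gb_per_gram

print(f"NAND density: {nand_gb_per_gram:.1f} GB/gram")
print(f"DNA advantage: {improvement:,.0f}x")   # 4,265,625,000x, matching the text
```

Dividing the two densities yields exactly the 4,265,625,000x figure cited in the article.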
Is all this unrealistic? Maybe. But even if unit growth does not
continue at an exponential rate, the ideas available today could take us
wherever that growth leads. And that will be a very different world of
information from today.
--Walden Rhines is the CEO of Mentor Graphics.
Notes and references:
1. Calculations are based on the long-term trend of reductions in cost per transistor of 33.4% per year from 1954-2011.
2. Based on Apple's calculation of an average 1.5GB file size for a 2-hour standard-definition movie, and a 58x increase (1,856GB) over the current mid-tier 32GB iPod Touch.
3. Based on a 58x increase over the 32GB iPod Touch, which Apple reports can hold 6,400 songs.
4. 1.4MB free iBook, published by Bryant Tang, released July 3, 2012; downloaded from iTunes August 22, 2012.
5. A 2000 study by UC Berkeley professors Peter Lyman and Hal Varian estimated the size of the digitized print collections of the Library of Congress at 208 terabytes.
6. Charles Dennison, "Scaling Challenges and Market Opportunity for Phase Change Memory."
7. Comparison based on weight: a Samsung Pro Class 1500 64GB NAND memory chip weighs 0.6 grams, while DNA is reported to hold 455 billion GB per gram.
8. Robert Lee Holtz, "Future of Data: Encoded in DNA," Wall Street Journal, August 16, 2012. Retrieved August 27, 2012.
9. DNA calculation based on the $0.10 per million bases figure quoted in Holtz, "Future of Data." One million bases are roughly equivalent to 1 megabyte of computer data storage, according to the Human Genome Project. The current price of flash memory is based on the sales price on Newegg.com (32GB for $26.99), as quoted at http://www.jcmit.com/flashprice.htm.
We need a disruptive technology that can break the extremely capital-intensive model. Until then, life under someone else's reference flow will be bad. Why stick around when the same skill can be applied to social media?
To the statement "While there are limits to the amount of information they can assimilate effectively, there is a virtually unlimited desire to have access to more information if it is affordable," I would add: if it is affordable "and is actionable knowledge."
Turning information into actionable knowledge means huge increases in processing power which means huge increases in transistors.
Of course, the algorithms for all this knowledge extraction are a different matter altogether.