
Compression schemes redux

1/26/2009 05:00 AM EST
re: Compression schemes redux
Kevin_Neilson   2/3/2009 7:33:12 PM
I think the author neglects the massive potential for compression across time due to slowly changing images, but regardless, compression ratios of a million to one are possible in theory. Imagine the complexity of the human body, which originates with a mere 12 billion bits in the DNA, the bulk of which are unimportant. A sufficiently powerful decompression engine could extrapolate the entire structure of any adult from the DNA alone. Or imagine this scene: "Ingrid Bergman leans against a maple tree trunk, one eye peering from beneath a wide-brimmed hat, waiting impatiently for her visitor." The decompression engine of my mind took those few bytes of information and rendered them into an image that might take megabytes to store on a DVD. There is no reason a CPU could not do the same, given a sufficiently large image library. In this case, almost any maple tree image suffices without a loss of information.
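The commenter's idea can be sketched as a toy shared-library codec: when sender and receiver both hold the same large dictionary of assets, the transmitted message need only carry a short index, and the "compression ratio" is the asset size divided by the index size. This is a minimal illustrative sketch, not anyone's actual scheme; the library contents, asset names, and sizes below are all assumed for the example.

```python
# Toy shared-library codec illustrating the comment's point: if both sides
# already share the data, a message can shrink to a tiny reference into it.
# LIBRARY and ASSET_SIZE_BYTES are hypothetical values for illustration.

LIBRARY = {
    0: "maple_tree_trunk.png",   # stand-in for a multi-megabyte stored image
    1: "wide_brimmed_hat.png",
    2: "park_bench.png",
}

ASSET_SIZE_BYTES = 5_000_000  # assumed size of each library image


def compress(asset_name: str) -> bytes:
    """Encode an asset as a one-byte index into the shared library."""
    for idx, name in LIBRARY.items():
        if name == asset_name:
            return bytes([idx])
    raise KeyError(asset_name)


def decompress(payload: bytes) -> str:
    """Expand the index back into the full library asset."""
    return LIBRARY[payload[0]]


token = compress("maple_tree_trunk.png")
assert decompress(token) == "maple_tree_trunk.png"
ratio = ASSET_SIZE_BYTES / len(token)
print(f"{ASSET_SIZE_BYTES} bytes referenced by {len(token)} byte(s): {ratio:,.0f}:1")
```

The million-to-one ratio here is of course an artifact of the shared dictionary: the information is not gone, it has merely been moved into state both parties agreed on in advance, which is exactly the commenter's point about DNA and the mind's image library.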
