"the block tends to be specified for the toughest requirement and then operated at that level."
I can understand that, but in cell phones the toughest requirement is usually power. I don't know the impact of inexact calculations on video quality, but I'm willing to bet that a 90% power reduction would trump exact calculations in many cases.
I agree that three identical low-resolution data engines running the same software would tend to produce identical inexact results, making a voting regime redundant. Nonetheless I think there is scope for more creative thinking here.
How about this?
Three (or more) low-resolution data engines running different algorithmic approaches to achieve the same functionality might produce different inexact results. Those results could then be averaged to produce a higher-resolution final result, but with a significant reduction in energy consumption.
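Roughly what I mean, as a back-of-the-envelope sketch in C (the approx_method_* functions are made-up stand-ins for three different inexact algorithms, not anything real):

#include <stdio.h>

/* Made-up stand-ins for three different inexact algorithms that compute
 * the same quantity by different approaches. */
static double approx_method_a(double x) { return x * 0.995; }
static double approx_method_b(double x) { return x * 1.003; }
static double approx_method_c(double x) { return x * 0.999; }

/* Average the three low-resolution results to claw back some accuracy. */
static double combined_estimate(double x)
{
    return (approx_method_a(x) + approx_method_b(x) + approx_method_c(x)) / 3.0;
}

int main(void)
{
    printf("combined: %f\n", combined_estimate(100.0));
    return 0;
}

Whether the averaged error is actually smaller depends on the three algorithms' errors not being correlated, which is exactly why they would have to take different approaches.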
They do indeed put compression/decompression engines in graphics processors.
In fact some systems have ways to offload compression/decompression to the GPU in a heterogeneous processor system.
But as with the examples given above, the accuracy or degree of losslessness required may depend on the application (video versus communications), and as such the block tends to be specified for the toughest requirement and then operated at that level.
Don't they put video decompression engines in processors/video chips nowadays to help with handling MPEG video streams? This seems like the perfect application for inexact calculation.
Also, VoLTE is the up-and-coming voice standard for phones that all the major cell companies are headed toward. This is another application that uses a reasonable amount of CPU power where inexact processing would work and the power savings would be very desirable.
If the power savings were 50% or less I would say that it wasn't very interesting, but at 90% you have the option of running multiple cores on the same data and voting for the correct answer. Three cores with only a 0.25% variance should be pretty close to reliable, depending on the distribution of the errors. Granted, it would "only" save 70% of the power (three cores at 10% each still draw 30% of the original), but it would definitely be worth exploring.
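To make that concrete, here is a rough sketch in C of a median-of-three "voter"; since the inexact results won't be bit-identical, you take the median and check the spread against the expected 0.25% variance (the run_core_* functions are hypothetical stand-ins for the same computation running on three low-power cores):

#include <math.h>
#include <stdio.h>

/* Hypothetical stand-ins for the same computation on three inexact cores. */
static double run_core_0(double x) { return x * 2.0 + 0.001; }
static double run_core_1(double x) { return x * 2.0 - 0.002; }
static double run_core_2(double x) { return x * 2.0 + 0.0005; }

/* Median of three: a cheap voter when results agree within a tolerance. */
static double median3(double a, double b, double c)
{
    if (a > b) { double t = a; a = b; b = t; }
    if (b > c) { double t = b; b = c; c = t; }
    if (a > b) { double t = a; a = b; b = t; }
    return b;
}

int main(void)
{
    double r0 = run_core_0(21.0), r1 = run_core_1(21.0), r2 = run_core_2(21.0);
    double voted = median3(r0, r1, r2);

    /* Flag the result if the spread exceeds the expected 0.25% variance. */
    double spread = fmax(fmax(fabs(r0 - r1), fabs(r1 - r2)), fabs(r0 - r2));
    if (spread > 0.0025 * fabs(voted))
        printf("warning: cores disagree beyond tolerance\n");
    printf("voted result: %f\n", voted);
    return 0;
}

If the spread check fails too often you could always fall back to a single exact run, which eats into the savings but keeps the worst case under control.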
It is also worth looking at where the errors show up. If programmatic logic is not affected, then network routers and many control systems would definitely benefit. I would expect that data collection systems that do advanced math would be leery of it unless the error range could be bounded and manageable.
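A "bounded and manageable" error range might look something like this sketch (all three helper functions are made-up placeholders): accept the cheap result only when an error estimate stays inside an application-defined bound, and otherwise fall back to the exact, higher-power path.

#include <stdio.h>

static double inexact_compute(double x) { return x * x * 0.98; } /* placeholder */
static double exact_compute(double x)   { return x * x; }        /* placeholder */
static double error_estimate(double x)  { return 0.02 * x * x; } /* placeholder */

/* Use the inexact result only when its estimated error is within bound. */
static double compute_bounded(double x, double max_abs_error)
{
    if (error_estimate(x) <= max_abs_error)
        return inexact_compute(x);   /* error is bounded and manageable */
    return exact_compute(x);         /* otherwise pay for the exact path */
}

int main(void)
{
    printf("%f\n", compute_bounded(3.0, 0.5));   /* cheap path */
    printf("%f\n", compute_bounded(30.0, 0.5));  /* falls back to exact */
    return 0;
}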