Whoa, what's bitten you? Flash running your PC hot has exactly NOTHING to do with this. Flash is just a piece of junk that needs to die. Even Adobe realizes this--now if only web developers did too...
Recompiling source to native code is the most efficient thing to do, and that's what I was talking about. Emulating a different instruction set in real time is not efficient.
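To make the point concrete, here's a minimal sketch (toy ISA and opcodes invented for illustration) of why real-time emulation costs so much: every foreign instruction goes through a software fetch/decode/dispatch loop, while recompiled code runs each operation directly with no per-instruction decode.

```python
def emulate(program, regs):
    """Interpret a toy instruction stream of (opcode, dst, src) tuples."""
    for op, dst, src in program:      # fetch
        if op == "MOV":               # decode + dispatch: the per-instruction
            regs[dst] = regs[src]     # overhead that recompilation removes
        elif op == "ADD":
            regs[dst] += regs[src]
        elif op == "MUL":
            regs[dst] *= regs[src]
        else:
            raise ValueError(f"unknown opcode {op}")
    return regs

def recompiled(regs):
    """The same program after 'recompilation': straight-line host code,
    no decode loop in the way."""
    regs["r0"] = regs["r1"]
    regs["r0"] += regs["r2"]
    regs["r0"] *= regs["r2"]
    return regs

program = [("MOV", "r0", "r1"), ("ADD", "r0", "r2"), ("MUL", "r0", "r2")]
print(emulate(program, {"r0": 0, "r1": 3, "r2": 4}))     # r0 ends up 28
print(recompiled({"r0": 0, "r1": 3, "r2": 4}))           # same result, no loop
```

Both paths compute the same answer; the difference is that the interpreter pays the dispatch cost on every single instruction, which is exactly the inefficiency of emulating another instruction set in real time.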
Yeah, and this is exactly why THIS happens: when I run Flash (= Adobe misery), my PC almost catches fire, all thanks to the damned protocol and layer stackers out there. Why don't you software programmers do a proper job?
FYI, here's some analysis that was apparently cut from the current version of the story:
“It’s very possible this could be useful technology in a 2014 time frame,” said Kevin Krewell, senior analyst with market watcher Linley Group (Mountain View, Calif.).
Startup Transitive Technology helped Apple and SGI transition to the x86 from PowerPC and MIPS respectively before Transitive was acquired by IBM. Emulation “could be used for migration, but not as a long term strategy,” Krewell said.
Emulation is typically used when the new architecture has higher performance than the old one, which is not the case, at least today, moving from x86 to ARM. “By the time this software is out in 2014 you could see chips using ARM’s v8, 64-bit architecture,” Krewell noted.
“That said, you will lose some of the power efficiency of ARM when doing emulation,” Krewell said. “Once you lose 20 or more percent of efficiency, you put ARM on par with an x86,” he added.
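Krewell's "20 percent" remark is easy to check with a back-of-the-envelope calculation. The numbers below are made up for illustration (the quote doesn't give ARM's actual efficiency advantage), but they show the shape of the argument: a modest emulation tax can wipe out the whole perf-per-watt lead.

```python
# Hypothetical, normalized numbers -- only the 20% overhead is from the quote.
x86_perf_per_watt = 1.00           # baseline (assumed)
arm_perf_per_watt = 1.25           # assumed native-ARM efficiency advantage
emulation_overhead = 0.20          # "20 or more percent" lost to emulation

arm_emulated = arm_perf_per_watt * (1 - emulation_overhead)
print(arm_emulated)                # 1.0 -- back on par with the x86 baseline
```

With a 25% native advantage, losing 20% to emulation lands ARM exactly at x86's level, which is the "on par" outcome Krewell describes; any larger overhead puts it behind.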
Fergie 65: on performance lessening under emulation, exactly. And if you mean that the performance hit will lessen for applications recompiled for specific x86 instruction calls, exactly again. Hardware substitution always starts by addressing the core instruction translation required for the target application.