I'll be curious to see more about this.
The biggest issue I'm aware of for the gaming crowd chasing that last bit of performance is heat dissipation. A few years back, one of the extreme-tech sites ran a feature on overclocking one of the Intel Pentium models to 5 GHz. They used liquid nitrogen to cool it. It was a "Kids, don't try this at home" sort of thing. They could point to it and say "We *did* it!", but it wasn't a generally applicable solution.
I also wonder about the graphics core. My sense is that Nvidia has the edge in gaming, and there will be gamers who want to use the AMD CPU but bypass its integrated graphics and let an Nvidia graphics card handle the video. There will probably be a BIOS option to make that sort of thing selectable.
The PS3 used the Cell processor, a very unusual design from the Sony/IBM/Toshiba alliance: one general-purpose PowerPC core plus eight specialized SPE cores, one of which was disabled in the PS3. Aside from being difficult for developers to program, the Cell has seen essentially no further development for the past four years, so it is no surprise it was not a contender for the new generation of consoles.
(The Wii and Xbox 360 both used more conventional PowerPC processors from IBM, so pricing might be an issue for them.)