Since many comment threads have discussed processor prices, I would like to share my opinion on technology-driven products. Products should be designed with reusability of the most expensive components in mind. Do we really need so many high-end processors in every home, residing in laptops, desktops, game consoles, tablets, and more? Designers should avoid giving game consoles wildly varied interfaces and instead use the standard interfaces found in computers; that way, hardware reuse would be far better justified.
I suspected such a large device (363 mm^2 on a 28 nm process) was being fabricated ahead of its process economics, which did not sound like TSMC's mantra of effective capacity utilization relative to the marginal cost of wafer starts for other high-end products.
Did some research and found a prior Xbox teardown suggesting the processor at about 20% of the bill of materials (BOM).
At $499 retail, the processor cost is then approximately $100. That is about 27.5 cents per mm^2 of die area, which is competitive with Intel's fully burdened cost of production.
Adding AMD's price for the design effort, I personally suspect the part is in the $125 range. That is about 34 cents per mm^2 of die area. For an embedded x86 APU, relative to commercial market products, that is still quite a bargain for the die area.
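The cost-per-area figures are just the suspected processor cost divided by the die size; a quick sketch using the numbers cited above (363 mm^2 die, $100 teardown-based estimate, $125 with design margin):

```python
# Back-of-envelope cost per unit of die area, using the figures from the thread.
DIE_AREA_MM2 = 363  # 28 nm APU die size cited earlier

estimates = {
    "teardown estimate (20% of $499 BOM)": 100.0,
    "with AMD design margin": 125.0,
}

for label, cost in estimates.items():
    print(f"{label}: ${cost / DIE_AREA_MM2:.3f} per mm^2")
# teardown estimate (20% of $499 BOM): $0.275 per mm^2
# with AMD design margin: $0.344 per mm^2
```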
The PS4 GPU is also based on AMD Radeon cores. The big difference is the number of compute units: 18 vs. 12 for the XB1, hence the roughly 50% difference in raw power (1.8 TFLOPs vs. 1.2 TFLOPs), provided they run at the same clock speed of 800 MHz. Keep in mind that these numbers (for the XB1 part) are not official, just educated guesses.
Microsoft recently commented in an interview that they were able to slightly boost the GPU clock speed without thermal issues. Combined with the fact that they are now reporting 1.31 TFLOPs in their slides, I think the estimates are not far from reality.
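The TFLOPs figures above are consistent with the usual peak-throughput arithmetic for AMD's GCN architecture, assuming 64 shader lanes per compute unit and 2 FLOPs (one fused multiply-add) per lane per cycle. A sketch under those assumptions:

```python
# Peak single-precision throughput for a GCN-style GPU:
#   compute units * 64 lanes/CU * 2 FLOPs per lane per cycle (FMA) * clock
def peak_tflops(compute_units, clock_ghz, lanes=64, flops_per_lane=2):
    return compute_units * lanes * flops_per_lane * clock_ghz / 1000.0

print(peak_tflops(18, 0.800))  # PS4: ~1.84 TFLOPs
print(peak_tflops(12, 0.800))  # XB1 at the guessed 800 MHz: ~1.23 TFLOPs

# Working backwards, the reported 1.31 TFLOPs implies a clock near 853 MHz:
print(1.31e3 / (12 * 64 * 2))  # implied clock in GHz, ~0.853
```

Running the implied-clock line backwards is why the reported 1.31 TFLOPs reads as a modest clock bump rather than a change in compute-unit count.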
Not surprisingly, Microsoft says yields are "on or exceeding expectations."
I can't tell you how many times I've heard that, but they make a good case: the chip is 47% cache memory array with a fairly regular graphics core (both repairable via redundancy), and Nvidia and AMD both make even bigger chips.
I'll check on the yield issue with folks at the conference today. They did say it was one of the largest dies in consumer electronics--ironically the sort of thing I used to hear from Sony's Ken Kutaragi, Mr. PlayStation.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Specifically the guests will discuss sensors, security, and lessons from IoT deployments.