After looking at the slides, I believe that beyond being a very powerful game station, the XBox One has the opportunity to win the competition by acting as a communication and media hub / set-top box.
It not only has tons of multimedia processing horsepower, but it also includes the most advanced video capture device. Add to that Microsoft's acquisition of Skype and you have a media center in which the gaming capabilities are only a small part of the big picture!
Gamers already know what it is. They almost don't have to market to them anymore, they can let the game companies do that. Now they have to win over mom and dad, and maybe grandma too. If they can do that, they'll OWN the living room.
I think the XB1 is lacking a clear focus from a management standpoint. It tries to do too many things, but arguably none of them is particularly attractive compared to the competition.
It plays games, yes. But comments from 3rd-party developers suggest there's a 40% deficit compared to Sony's PS4 in terms of GPU power. It streams videos, runs Skype, records gaming sessions, and integrates cable TV into its interface. All of these features sit behind a $60-per-year subscription paywall, on top of the $500 upfront asking price. Speaking of which, the PS4's lower price of $400 sure makes Microsoft's offering look a lot less attractive.
It was suggested that the inclusion of the Kinect sensor contributed to XB1's higher price. Microsoft needs to offer a compelling reason (ideally in the form of a killer app, something similar to the wildly successful Wii Sports) for people to be willing to shell out that extra $100 over the competition.
I think the Xbox One is a great piece of engineering. Unfortunately, it was mismanaged from pricing all the way to PR (which MS later admitted). Hopefully things will be better after the management shakeup.
Yeah, they did seem a bit scattered with this release. Actually being more expensive than the new PlayStation didn't help at all either. I'm really curious to see how things change as these consoles come into people's homes. We had previously established a kind of hierarchy based on price and games: the Wii was purchased by fans of Nintendo, then people generally chose between Xbox and PS3, a decision often driven by price. Who knows how it will break down now.
The PS4 GPU is also based on AMD Radeon cores. The big difference is the number of compute units: 18 vs. the XB1's 12, hence the roughly 50% difference in raw power (1.8 TFLOPs vs. 1.2 TFLOPs), provided that they run at the same clock speed of 800 MHz. Keep in mind that these numbers (on the XB1 side) are not official, just educated guesses.
Microsoft recently commented in an interview that they were able to slightly boost the GPU clock speed without thermal issues. Together with the fact that they are now reporting 1.31 TFLOPs in the slides, I think the estimates are not too far from reality.
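For anyone who wants to check the arithmetic, here's a quick sketch (Python, using my own assumptions about the GCN layout: 64 shader ALUs per compute unit, with a fused multiply-add counted as two ops per clock). Back-solving from the 1.31 TFLOPs figure puts the boosted clock around 853 MHz:

```python
# Back-of-the-envelope GPU throughput math for GCN-style parts.
# Assumptions (mine, not from the slides): 64 ALUs per compute unit,
# each doing one fused multiply-add per clock, counted as 2 FLOPs.

ALUS_PER_CU = 64
OPS_PER_ALU_PER_CLOCK = 2  # FMA = multiply + add

def peak_tflops(compute_units, clock_mhz):
    """Theoretical single-precision peak, in TFLOPs."""
    flops = compute_units * ALUS_PER_CU * OPS_PER_ALU_PER_CLOCK * clock_mhz * 1e6
    return flops / 1e12

print(peak_tflops(18, 800))  # PS4:           ~1.84 TFLOPs
print(peak_tflops(12, 800))  # XB1 @ 800 MHz: ~1.23 TFLOPs
print(peak_tflops(12, 853))  # XB1 @ 853 MHz: ~1.31 TFLOPs, matching the slides
```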
Not being interested in games, I find the XBox One to be massive overkill, but I'm likely wrong about that.
What is interesting is the Kinect and the processing that makes it work. Will the Xbox One operate without a Kinect attached? Concerned about security? Just disconnect the sensor.
The potential of the system as a spying device is about on par with a laptop with an integrated webcam. Hacking into an Xbox1 is probably a little tougher than doing the same on a laptop, since Microsoft has likely "hardened" the security of XNA to prevent intrusion into their cash flow. There aren't as many programmers/hackers familiar with XNA as with other dotNet languages/systems.
My laptop has a piece of electrical tape permanently stuck over my built-in spycam.
With as much CPU/GPU power as the XBox1 seems to have, little parasitic programs might be able to operate almost invisibly. That somewhat scares me, post-Snowden/NSA.
There isn't any obvious malware monitoring on my daughter's X360; I would hope that the next gen adds something like that. I would assume that users can buy things off the Internet with credit cards, thereby providing a "monetization incentive" for all the accomplished malware writers in eastern Europe and Asia (and other places as well).
There were rumors about XB1 production yield issues, which I think are directly related to the die size of the SoC. With the huge amount of area consumed by the on-die ESRAM, Microsoft traded off GPU compute unit count to keep the die size in check.
I'll check on the yield issue with folks at the conference today. They did say it was one of the largest dies in consumer electronics--ironically the sort of thing I used to hear from Sony's Ken Kutaragi, Mr. PlayStation.
Not surprisingly, Msoft says yields are "on or exceeding expectations"
Can't tell you how many times I've heard that, but they make a good case: the chip is 47% cache memory array plus a fairly regular graphics core (both repairable), and Nvidia and AMD both make even bigger chips.
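To make the "repairable" point concrete, here's a toy sketch using the standard Poisson yield model. The defect density is purely a placeholder of mine, not anything Microsoft or TSMC has disclosed; the idea is just that treating the 47% cache array as defect-tolerant (spare rows/columns) shrinks the area that has to come out clean:

```python
import math

# Rough Poisson yield model: Y = exp(-A * D0), with A the die area in cm^2
# and D0 the defect density in defects/cm^2. D0 below is a placeholder
# guess for mature 28 nm, NOT a disclosed number.

DIE_AREA_MM2 = 363.0
REPAIRABLE_FRACTION = 0.47  # the cache array fraction cited above, treated
                            # (optimistically) as fully repairable via spares

def poisson_yield(area_mm2, d0_per_cm2):
    return math.exp(-(area_mm2 / 100.0) * d0_per_cm2)

d0 = 0.25  # assumed defects/cm^2 -- placeholder
print(f"naive yield:        {poisson_yield(DIE_AREA_MM2, d0):.0%}")  # ~40%
print(f"repair-aware yield: "
      f"{poisson_yield(DIE_AREA_MM2 * (1 - REPAIRABLE_FRACTION), d0):.0%}")  # ~62%
```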
I suspected such a large device (363 mm^2 on a 28 nm process) was being fabricated ahead of its process economics, which did not sound like the TSMC mantra of effective capacity utilization vis-a-vis the marginal cost of other high-end products' wafer starts.
Did some research and found a prior Xbox teardown suggesting the processor at 20% of B.O.M.
At $499 retail, the processor cost is then approximately $100. That is about 27 cents per mm^2 of die area, which is competitive with Intel's fully burdened cost of production.
Adding AMD's price for the design effort, I personally suspect the part is in the $125 range. That is about 34 cents per mm^2 of die area. Relative to commercial-market products, this embedded x86 APU is still quite a bargain for the die area.
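The cost-per-area arithmetic above, spelled out (the 20%-of-retail shortcut is the same rough one used in the comment; the $125 figure is my own guess):

```python
# Cost per mm^2 of die area under two chip-cost estimates.
DIE_AREA_MM2 = 363.0

for label, chip_cost in [("teardown-based estimate", 0.20 * 499),
                         ("guess incl. AMD design fee", 125.0)]:
    per_mm2 = chip_cost / DIE_AREA_MM2
    print(f"{label}: ${chip_cost:.0f} -> "
          f"${per_mm2:.2f}/mm^2 ({100 * per_mm2:.0f} cents/mm^2)")
# -> ~$0.27/mm^2 (27 cents) and ~$0.34/mm^2 (34 cents)
```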
Since many comment threads have come up discussing processor prices, I would like to share my opinion about technology-driven products. Products should be designed with reusability of the most expensive components in mind. Is it really necessary to have so many high-end processors in every home, residing in laptops, desktops, game consoles, tablets, and more? Designers should avoid giving game consoles highly varied, proprietary interfaces and instead use the standard interfaces found in computers; that way hardware reuse would be better justified.