Tegra 3 is already an old chip. I expect nVidia will come up with something new to compete with those new Samsung and TI offerings. Since a quad-core A9 has roughly the same performance as a dual-core A15, there is no point in nVidia producing a dual-A15 chip. If I were nVidia, I would release either a quad-A15 or a home-grown 64-bit ARM processor to take the market-leading position.
This brings another big jump in the media capabilities of portable devices like tablets, which until now have been seen as pure communication and mid-level entertainment devices.
This means Samsung is gunning for the top spot in all handhelds, including tablets and smartphones.
Should be interesting to see who wins the hype contest at CES in January. Based on sample announcements so far, we should see products with Samsung Exynos (2GHz dual-core A15s plus a multi-core Mali GPU built on a 32nm process), TI OMAP5 (2GHz dual-core A15s plus two M4s and a dual-core PowerVR GPU built on a 28nm process), Qualcomm S4 (1.5GHz dual-core Kraits plus an Adreno GPU and integrated baseband), Nvidia Tegra 3 (1.5GHz quad-core A9s with a multi-core Nvidia GPU), and maybe even a competitive ST Nova A9600 offering. Just enough differentiation to enable a great debate among the technical media.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act on that data in real time, 24/7. Are the design challenges the same as for embedded systems, just with some developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests discuss sensors, security, and lessons from IoT deployments.
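As a rough illustration of the kind of firmware loop such a device runs, here is a minimal sketch in C of a sense-buffer-send cycle. It is not taken from the podcast or any particular platform; read_sensor() and send_batch() are hypothetical stubs standing in for a real sensor driver and a real network stack such as MQTT or CoAP.

/* A minimal sketch, assuming a battery-powered node that samples a sensor,
 * reacts locally to out-of-range readings, and uploads data in batches.
 * All names here are illustrative, not a specific vendor API.            */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define BATCH_SIZE   4      /* readings buffered before each upload        */
#define ALERT_LIMIT  28.0   /* threshold for an immediate local reaction   */
#define CYCLES       8      /* demo length; real firmware would loop 24/7  */

/* Stub sensor driver: a real device would read an ADC or an I2C sensor. */
static double read_sensor(void)
{
    return 20.0 + (rand() % 100) / 10.0;   /* fake temperature in Celsius */
}

/* Stub uplink: a real device might publish over MQTT, CoAP, or LoRaWAN. */
static void send_batch(const double *buf, int n)
{
    printf("uploading %d readings:", n);
    for (int i = 0; i < n; i++)
        printf(" %.1f", buf[i]);
    printf("\n");
}

int main(void)
{
    double batch[BATCH_SIZE];
    int count = 0;

    srand((unsigned)time(NULL));

    for (int cycle = 0; cycle < CYCLES; cycle++) {
        double reading = read_sensor();

        /* Act on data in real time: react locally instead of waiting
         * for the next upload when a reading crosses the threshold.   */
        if (reading > ALERT_LIMIT)
            printf("alert: high reading %.1f\n", reading);

        /* Buffer readings and transmit in batches; keeping the radio
         * off between uploads is what saves power on a constrained node. */
        batch[count++] = reading;
        if (count == BATCH_SIZE) {
            send_batch(batch, count);
            count = 0;
        }
    }

    /* Flush anything left in the buffer before the demo exits. */
    if (count > 0)
        send_batch(batch, count);
    return 0;
}

The batching-versus-alerting split is the point of the sketch: routine data can wait for a power-friendly bulk upload, while safety- or time-critical readings are handled on the device itself.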