@DMc: Good question. My hunch is most of his interactions (and most of the interactions of his peers all across the industry) involve a significant amount of face time at non-public events the big companies sponsor for their partners around particular initiatives and design reviews and customer updates.
I think there's a whole ecosystem of small, focused events and meetings most of us never hear about, where a lot of relationship building and work gets done in the quiet NDA world.
The PC era is hardly over, but the market has shifted.
The PC market is largely saturated. Most folks who can use a PC likely have one. There is a substantial market, but it's replacements and upgrades. New sales will be relatively few, and the growth beloved of the financial markets won't exist.
Growth is in data centers and mobile, and Intel's challenge is addressing those markets.
They have a leg up in datacenters, because those already use Intel processors. Mobile is much more of a challenge, because ARM has the same sort of leg up there, and Intel is still struggling to match ARM's power efficiency in an environment where battery life is the scarce resource.
But Intel still faces a challenge in retaining datacenter market share. As datacenters proliferate and server density grows, power efficiency becomes an issue there, too, as power costs skyrocket. ARM is poised to challenge them in the datacenter with chips that use less power and generate less heat.
PCs have been faster than the networks they connect to for a while, but network connectivity isn't the only factor. Historically, most of what the PC did was purely local: users ran programs stored on local drives, and created and manipulated data that was also local. HD access speed, CPU power, and graphics performance were far more important than how quick the network was. For most PC usage, they arguably still are.
Likely not a lot. How much of it would require physically meeting the people he talks to, in an age of email and video conferencing? It's quite possible he's never actually met some of the folks he keeps in touch with. (And I'd guess some he might have met would be through industry trade shows, where they more or less came to him.)
@Dylan, I stand corrected! You and Rick are the usual suspects! The point I was trying to make is that Hadoop is not an algorithm but a software framework with many basic algorithms. It is up to the developer using this framework to implement application-specific algorithms. It is an implementation of MapReduce which is ideally suited to mapping (sorting, etc) and reducing (basic parameters of the data like frequency) of large sets of data in a distributed computing environment. Central to this ecosystem is the high speed communication and computing & storage infrastructure.
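To make the framework-vs-algorithm point concrete, here is a minimal, single-machine sketch of the MapReduce model that Hadoop implements: a map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase computes a basic parameter of the data (here, word frequency). This is illustrative only; the function names and the toy input are my own, and real Hadoop distributes these phases across a cluster.

```python
# Toy MapReduce word count: map -> shuffle -> reduce, all in one process.
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group all emitted values by their key (the word).
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: collapse each key's list of values into a single count.
    return {key: sum(values) for key, values in grouped.items()}

docs = ["big data big deal", "data at scale"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["big"])   # 2
print(counts["data"])  # 2
```

The application-specific logic lives entirely in the map and reduce functions; the framework's job is the plumbing in between, which is why Hadoop is a platform rather than an algorithm.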
Hadoop's open source tag hasn't hindered its adoption, and it is here to stay, gainfully employing many! You should come by one of these days to Yahoo HQ on Mathilda for the Hadoop meetup every third Wednesday @6:00PM. There is plenty of Hadoop talk, not to mention free food & beer!
Just to be pithy--and because I am curious--I asked Ronak over email if he uses social networks (the digital kind). Still waiting for a response; probably not because he is ON Facebook, but maybe because he is INSIDE Facebook. He said he had a couple customer meetings in the Bay Area.
MP- thanks for weighing in and adding your perspective. I wish to point out, for the record, that the above story is authored by Rick Merritt, so it's not my description. But even so, it's not clear to me why software purists would take exception to that description. Just because Hadoop is open source?
@Dylan: some of the software purists will take exception to your description "...accelerates some big-data algorithm such as Hadoop," because Hadoop is an open source platform for distributed and scalable computing.
The trend toward fast, distributed and scalable computing has accelerated recently with the growth of cloud computing & storage. To that end, there has been a growth / consolidation of data centers in the US & EU toward mega datacenters. I doubt this will be the norm in developing economies, where the combination of lower power servers and better computing platforms like Hadoop will trigger a trend toward mini / micro datacenters. Power consumption will be another big driver of this latter trend.
I keep hearing the PC era is over, but PCs still take up most of the floorspace in the tech area of most retail electronics stores. I think the reason PCs are no longer driving innovation in processors is that they're already much faster than the wireless and SMB networks to which the vast majority of PCs tend to connect. Thoughts?
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Are the design challenges the same as with embedded systems, but with some developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.