News & Analysis

Google Ramps Up Chip Design

A long trail
2/12/2014 02:08 PM EST
29 comments
rick merritt
User Rank
Author
I'm hungry for details
rick merritt   2/12/2014 2:33:16 PM
I've reached out to a couple of Google contacts for an interview, but given the secretive posture the search giant has historically taken around its data center technologies, I don't expect any substantive responses any time soon.

That said, I am ready any time for an interview with anyone who knows more about this topic.

AnySilicon
User Rank
Freelancer
Android
AnySilicon   2/12/2014 2:55:07 PM
Could it be a CPU optimized for Android that Google can sell together with their OS? It's the only link I see to the mobile market & IoT.

krisi
User Rank
CEO
Re: I'm hungry for details
krisi   2/12/2014 2:56:02 PM
This is a pretty cool development for all IC design guys... I am hungry for any info too.

Devashish Paul
User Rank
Apprentice
Re: I'm hungry for details
Devashish Paul   2/12/2014 8:35:09 PM
Rick, whether you are Google or Baidu, I can't see any benefit in just doing vanilla processors that you can get from any ARM or x86 vendor. I'd imagine that you want to integrate CPU, GPU, and fabric into a single low-latency, highly integrated SoC. I'd imagine that something highly optimized for compute, along the lines of what you'd get if you integrated Xeon + Nvidia + fabric, would be where you want to go to cram in a lot of compute at data center scale with low latency, without the overhead of a lot of NICs, etc., and to keep system-level power down. Trying to squeeze down power and latency around what Intel gives you is probably not going to cut it.

alex_m1
User Rank
CEO
Re: I'm hungry for details
alex_m1   2/22/2014 8:57:20 PM
This might be what Google is working on: web.media.mit.edu/~bates/Summary_files/BatesTalk.pdf and http://www.bdti.com/InsideDSP/2013/10/23/SingularComputing

Sheetal.Pandey
User Rank
Manager
Re: I'm hungry for details
Sheetal.Pandey   2/25/2014 2:18:14 PM
Google should be careful about getting into chip design, as it's not their forte. They are good with software and should think twice before getting so deep into the hardware business. In the hardware business, losses are huge.

TonyTib
User Rank
CEO
Maybe it's exploratory
TonyTib   2/12/2014 3:59:32 PM
I was at a motor control seminar last year; three Google employees attended, but they wouldn't say why they were there. My guess is just to learn.

_hm
User Rank
CEO
Good News
_hm   2/12/2014 5:57:27 PM
It is good news that Google is ramping up its efforts in chip design. These should be for more consumer product design; for those types of products, one needs custom-designed ICs.

Along with processing, that will involve MEMS and other mixed-signal technology.

betajet
User Rank
CEO
I would guess custom search engine
betajet   2/12/2014 8:15:03 PM
There's no point in Google making consumer chips.  Allwinner and MediaTek can make such chips cheaper than Google.  It would be like Google making its own phones -- they're much better off setting the software standard and letting their partners compete against each other to drive the price down.

Besides, why does Google have Android?  So that people can get access to mobile advertising, because that's how Google makes money -- selling advertising.  And they target their advertising based on what you've searched for.  Search is the crown jewels, and it takes a phenomenal amount of computing power and electricity.

So here you have a well-defined highly-parallel problem.  Sounds like a perfect application for custom silicon.  Instead of interpreting search software on a general-purpose CPU, do it directly in hardware so that data isn't being copied redundantly, which is what really consumes the energy.  If custom silicon cuts the Google electric bill in half and doubles the performance of each data center, the silicon pays for itself pretty quickly.  And I'm just being conservative with that factor of 2.
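A quick back-of-envelope sketch of that payback argument (every figure below is a made-up placeholder, not a real Google number):

    # Illustrative payback estimate for custom search silicon -- all values are assumptions.
    annual_power_bill = 500e6      # assumed yearly data-center electricity spend, USD
    savings_fraction = 0.5         # the "cut the bill in half" scenario above
    program_cost = 300e6           # assumed NRE + deployment cost of the custom silicon, USD

    annual_savings = annual_power_bill * savings_fraction
    payback_years = program_cost / annual_savings
    print(f"Payback in roughly {payback_years:.1f} years")   # ~1.2 years with these numbers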

JMO/YMMV

markhahn0
User Rank
Rookie
processor in memory in network
markhahn0   2/12/2014 9:36:29 PM
If I were with Google, I wouldn't be satisfied with shaking up SDN, or tweaking out some minor mod of a conventional CPU node.  I think that's what's so tedious about all the coverage of FB's boring form-factor changes.

Google is in a position where they can look at fundamental changes in the programming model, in ways that conventional suppliers can't.  For instance, GPUs have demonstrated that there's a LOT of parallelism out there, in spite of the horrible programming model.  Google could be putting DRAM in-package.  They could find a nice uniform way to address large numbers of these nodes (sort of a merged network/DRAM fabric).  If you really go SDN, it doesn't make a lot of sense to stick with the artifacts of traditional Ethernet designs (subnets, VLANs, ISO layers).
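One hypothetical way to picture that merged network/DRAM fabric (a minimal Python sketch; the bit widths are just assumptions): a single flat address space where the high bits pick a node and the low bits pick a local DRAM offset, so remote memory is addressed the same way as local memory.

    NODE_BITS = 16     # assumed: up to 65,536 nodes in the fabric
    LOCAL_BITS = 40    # assumed: up to 1 TB of DRAM per node

    def split_global_address(addr):
        # Split a flat fabric address into (node id, local DRAM offset).
        assert addr < (1 << (NODE_BITS + LOCAL_BITS)), "address outside fabric range"
        node_id = addr >> LOCAL_BITS
        local_offset = addr & ((1 << LOCAL_BITS) - 1)
        return node_id, local_offset

    addr = (3 << LOCAL_BITS) | 0x123456        # node 3, byte offset 0x123456
    print(split_global_address(addr))          # (3, 1193046)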

People usually think of this "to go custom or not" question as hinging on how much of the conventional architecture can be jettisoned.  (I.e., if your nodes have nothing but CPU, DRAM, flash, and fabric, you sure don't need 8 ports of SATA or a 6-port USB3 controller, but you probably do want some kind of management coprocessor.)  But Google should be thinking about more fundamental change, not just subtractions...

frosty_the_snowman
User Rank
Rookie
Re: I'm hungry for details
frosty_the_snowman   2/12/2014 9:50:34 PM
Please send the URL of the blogger. Thanks, -Richard

rick merritt
User Rank
Author
Re: I'm hungry for details
rick merritt   2/13/2014 1:41:41 AM
@GSMD Dan Luu's blog is making my head spin!

alex_m1
User Rank
CEO
AI , computer vision chips , or just via based asic.
alex_m1   2/13/2014 5:46:20 AM
Google currently has big power barriers in Google Glass and robotics, which prevent them from building the products they want. So some sort of low-power, custom-fit image processor is one possibility. Another possibility is some sort of power-optimized deep learning algorithm (a hot new AI technique) that they can use in Glass, robots, and even in the data center.


The other possibility is, of course, accelerators, but do they have enough volume to justify doing chips at 28nm, considering that they can request customization from big suppliers (probably contingent on big orders)? My guess is that for accelerators they're more likely to use eASIC (via-programmable ASICs) or something similar.

zewde yeraswork
User Rank
Blogger
Re: AI , computer vision chips , or just via based asic.
zewde yeraswork   2/13/2014 10:22:17 AM
Google has demonstrated its interest in AI by acquiring DeepMind recently. I think any of those options makes sense for Google, as far as what they'd like to do with silicon. But it's still strange that they think this is an area where they need to expand -- into silicon -- when ARM is arriving right now and offering all of these new options in the data center.

JimMcGregor
User Rank
Blogger
Re: AI , computer vision chips , or just via based asic.
JimMcGregor   2/13/2014 12:15:38 PM
There has been speculation about all the internet powerhouses with large data centers developing their own chips. While it is a possibility, it is unlikely. Unlike the mobile market, where Apple forged ahead with its own chip design because it was not satisfied with the solutions in the market, there are many options for custom server chips. In fact, this is a key part of AMD's new strategy. I would expect companies like Google and Facebook to partner with those companies that have the necessary IP and expertise to develop silicon solutions that meet their specific requirements. However, it is still in their best interest to have some expertise in-house to drive the effort.

alex_m1
User Rank
CEO
Re: AI , computer vision chips , or just via based asic.
alex_m1   2/13/2014 1:29:30 PM
That probably makes the most sense, since it seems Google's ASIC team is pretty small. Maybe they're just working together with their suppliers and creating unique IP to be integrated.

Of course, this could be a step on a long road for Google to build its own hardware expertise.

rick merritt
User Rank
Author
Re: AI , computer vision chips , or just via based asic.
rick merritt   2/13/2014 4:43:24 PM
@Jim McG: Indeed, Intel said last year they are doing custom bins and firmware spins for large (read: data center) customers.

Anyone heard of Intel or AMD contracting to do a full metal spin for, say, a Google or Amazon?

alex_m1
User Rank
CEO
Re: AI , computer vision chips , or just via based asic.
alex_m1   2/13/2014 1:25:37 PM
As far as I can tell, ARM (and its ecosystem) is coming up with generic chips. I see nothing about AI, nothing specific to Google Glass, and no very customized chips (for example, chips customized for memcached, with a worldwide market of less than a million units and Google maybe needing 100K of them).

DrFPGA
User Rank
Blogger
One or Two IC Guys...
DrFPGA   2/13/2014 9:27:13 AM
isn't a project. Google will just buy up a company when it is ready to do something real. I agree the most likely possibilities are the areas where they have very specific needs: specialized search engines (where they don't want to hand out their algorithms to another company) seem like the best bet...

rick merritt
User Rank
Author
Re: One or Two IC Guys...
rick merritt   2/13/2014 4:39:37 PM
@DrFPGA and others: Yeah, like a MapReduce accelerator or other Google-specific algorithm accelerator.

Is there much benefit for specific hardware there vs using an FPGA/GPU or other parallel processor?

alex_m1
User Rank
CEO
Re: One or Two IC Guys...
alex_m1   2/13/2014 6:51:57 PM
@rick I don't think you can accelerate MapReduce itself much, only the operations it runs, and those change by application.
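A tiny sketch of why (plain Python, with a hypothetical word-count example): the MapReduce plumbing below is generic and fixed; only the map/reduce callbacks change per application, and those callbacks are the part any accelerator would actually have to target.

    from collections import defaultdict

    def map_reduce(records, map_fn, reduce_fn):
        # Generic framework: group mapped values by key, then reduce each key's values.
        intermediate = defaultdict(list)
        for record in records:
            for key, value in map_fn(record):        # application-specific map step
                intermediate[key].append(value)
        return {key: reduce_fn(key, values)          # application-specific reduce step
                for key, values in intermediate.items()}

    # Example application: word count.
    count_words = lambda line: [(word, 1) for word in line.split()]
    sum_counts = lambda word, counts: sum(counts)

    print(map_reduce(["deep learning chips", "learning chips"], count_words, sum_counts))
    # {'deep': 1, 'learning': 2, 'chips': 2}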

Google uses a variety of algorithms for search. Some of those are machine learning, and especially deep learning, algorithms, which are quite new and are a kind of breakthrough artificial intelligence. For those, ASICs could become cheaper and lower power than GPUs; see [1]. Those same algorithms are also useful/critical in Glass, robots, phones, and other places that require AI.

Another option is cache server (memcached) acceleration. Recently FPGAs have shown great promise there, and an ASIC could do better.
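For a sense of why memcached is such a tempting hardware target, here is its ASCII protocol in a minimal Python sketch (it assumes a memcached server running on 127.0.0.1:11211); the request/response format really is this small and regular, which is what FPGA/ASIC front-ends exploit.

    import socket

    s = socket.create_connection(("127.0.0.1", 11211))
    s.sendall(b"set k 0 0 5\r\nhello\r\n")   # key "k", flags 0, no expiry, 5-byte value
    print(s.recv(1024))                      # b'STORED\r\n'
    s.sendall(b"get k\r\n")
    print(s.recv(1024))                      # b'VALUE k 0 5\r\nhello\r\nEND\r\n'
    s.close()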

There are also other search algorithms that could benefit, but accelerating those through hardware is quite an old idea (and one could use an FPGA/GPU), so we should ask: why now?


[1] http://www.technologyreview.com/news/523181/an-ai-chip-to-help-computers-understand-images/

 

zewde yeraswork
User Rank
Blogger
Re: One or Two IC Guys...
zewde yeraswork   2/14/2014 9:41:13 AM
It's those deep learning algorithms that could take Google into the future with AI and all that comes with it. They already use a broad range of algorithms for their searches. Now, with DeepMind and with parallel processing cores, possibly running on silicon that they themselves make, the sky is the limit.

chipmonk0
User Rank
Manager
Apple & Google are now like the Mainframe Co.s of the '50s & '60s
chipmonk0   2/14/2014 5:25:19 PM
Monopoly giants who design processors in-house to optimize for their proprietary algorithms -- only they are cleverer and won't get bogged down running their own fabs.

And then along comes a microprocessor, or its next equivalent, and the shake-out starts all over again.

There will always be some new innovator ready to throw the hammer (as in Apple's 1984 commercial) and shatter the status quo monopoly.

 

markwrob
User Rank
Rookie
google's universe is so big already...
markwrob   2/13/2014 11:28:53 AM
...and growing faster, it seems, every month.

While all the speculation in the posts here is good and thoughtful, I'd bet it is only scratching the surface of what Google is thinking about using its own chips for.  Addressing power consumption in their data centers seems like the sort of low-hanging fruit that justifies hiring hardware designers to begin with.  And Google has vast troves of its own data on just what processes and applications consume the most power and time today, I'm sure.

But after the largest of those are addressed, what's next?  They just noted they need solutions to network latency to handle the surge in connections from IoT, wearables, etc.  One way they might attack that is by pushing specialized processors closer to their network fringes so the appropriate levels of traffic can be handled where it matters most.  Then there are the new robotics initiatives with the likes of Foxconn, aided by acquisitions like Boston Dynamics.  This area alone could be quite high-profile and high-margin.

The sky's the limit, it's only the tip of the iceberg, pick whichever cliche you like, they all fit here.

rick merritt
User Rank
Author
Re: google's universe is so big already...
rick merritt   2/13/2014 4:41:22 PM
@MarkwRob: You suggest a sort of corporate IC group that could serve many masters from Project Glass to data center networking. Hmmmmmmm.

markwrob
User Rank
Rookie
Re: google's universe is so big already...
markwrob   2/13/2014 5:54:58 PM
@Rick,

Right, it could serve many masters there.  And as others have suggested, Google's group might best be used to set up the architectures (HW, FW, SW) and then partner with providers to implement its visions.  Some of those could be proofs of concept, others released as open source, others kept proprietary, though I think the last segment would be small.  The more Google can get its ideas used by the world (thus building scale and driving down costs), the more ads it can sell into the world.

AZskibum
User Rank
CEO
Re: google's universe is so big already...
AZskibum   2/15/2014 10:28:16 AM
If you consider all the things Google is investing in besides their bread & butter search & data centers, there are lots of reasons for them to have an in-house IC design organization. I agree, this small team is just the tip of the iceberg.

Bruzzer
User Rank
Freelancer
In support of silicon design producers
Bruzzer   2/13/2014 8:43:45 PM
I also believe it is best for cloud providers to work with and complement microprocessor and other silicon design producers, especially ARM 64 design producers, helping to add some necessary production volume to that business equation. Even designs held captive by customers can work, so long as the development makes complementary financial sense. In a concentrating industrial setting, drawing that line somewhere just makes sense on the cost-versus-expertise question, for sustainable industry, trade, employment, and gross domestic product, even at you know who.

I have also speculated about an x86 decode engine hardened into ARM, using instruction look-up tables that are certainly available from the other guess-who; others are known to be working on breaking that aspect of the Intel monopoly once again. That would certainly require the deep pockets of a public cloud provider to keep Intel's legal and financial guilds at bay.

Also noteworthy: on data analytics and batch processing, this analyst has determined from constant audit that multiple acceleration approaches and techniques are nascent realities of heterogeneous compute platforms capable of entering high-end Xeon space.

Mike Bruzzone, Camp Marketing

 

  

hazydave
User Rank
Manager
Some problems may need it
hazydave   2/20/2014 4:08:23 PM
If you look at high end blades in some big database systems, they're combining off-the-shelf multi-core x86 chips with FPGA-based hardware to accelerate distributed database processing. Google is bound to have problems that can be solved in a similar fashion. 

But they're also really big, and the cost, in parts, power, and speed limits, of FPGAs may be suggesting that custom silicon is the answer. And that kind of thing gets even more interesting if you build it into your CPUs, saving power, eliminating any bus or communications bottlenecks between the two processor areas, etc. 

Or maybe it's just plain old ARM chips they can buy from multiple sources. 
