Re: Innovation will continue
garydpdx   8/4/2013 12:06:31 PM
Performance progress has traditionally been led by advances in semiconductor process, but if we are approaching a plateau (and I'm optimistic that progress will merely become more challenging, scientifically and financially, rather than grinding to a halt), then we still have algorithms and architecture in which to innovate.  The challenges to advancing semiconductor process also challenge incumbent RTL-based design, since larger system-level design perspectives will be required.  Yes, that horse has been beaten for a long while, but there is growth (starting from an admittedly small base) as more and more people "get it" and turn to ESL.

Re: Is Moore's Law Dead? Does It Matter?
resistion   8/3/2013 9:03:24 AM
For the others to catch up to the US in technology, the US would have to be essentially static for an extended period of time, and you can be assured that is unlikely. In fact, it is probably more likely the others will try to stay a few steps back from the US leading edge, which they fear is fraught with unknown risks they cannot learn about fast enough.

I'd be more worried about something all of mankind gets stuck on due to fundamental physics, such as quantum or entropy limits.


Re: I propose the opposite is true
Tom Murphy   8/2/2013 11:25:02 AM
Bert: well, I respectfully disagree with your conclusion.  Moore's Law leads to faster chips, but the military tends to use the most advanced chips -- which are more expensive even under Moore's Law.  That favors the richer country, which economists say will be China in the not-too-distant future.  If Moore's Law ends, and chips stop getting faster at that rate, that advantage is narrowed or -- in the extreme -- even eliminated.

Of course, the wealthier country could still benefit from other advanced technologies that would speed up processing outside the chip.  And all major countries already have enough firepower to blow up the planet several times over, so this is probably academic.

On the civil liberties front, the demise of Moore's Law could limit the mass processing of big-data to the point where there is no instant analysis of individuals based on "total information awareness" programs. That would cut into the expansion of intelligence programs like the NSA's.


Re: Moore's "Law" is NOT the FIRST Paradigm to provide exponential growth of computing
jaybus0   8/2/2013 10:03:38 AM
I agree with your assessment of a quantum computer. However, technology being designed to implement a quantum computer could easily transfer to classical computers. In particular, the silicon photonics that Intel, IBM, and several others are working on could facilitate a paradigm shift such as the one suggested by Kurzweil.

What if an optical chip-to-chip data bus were as fast as the on-chip data bus? Just as we today connect a component implemented in one area of the die to a component in another area of the same die, a chip-speed optical bus would allow components to be connected across dies. This would be a paradigm shift. Instead of distributed computing using clusters of individual and independent computers, we could treat individual dies as a single virtual die, or in other words a very large SoC. In addition, this virtual SoC would be scalable to fit in whatever power envelope was available. There are many possibilities, but this one seems quite feasible in the near future.


KB3001   8/2/2013 9:50:50 AM
"So, what happens if the whole world has equal access to technology? Does stability depend on one country having a bigger stick than everyone else?"


The truth is precisely the opposite: peace takes place when there is an equilibrium of power, and that happens when there is equal access to technology, not when one country has a bigger stick than everyone else.


The American expert is right in seeking US advantage in high technology, but that is not necessarily in the interest of world peace :-) Others have to seek the same advantage, and at some stage they will realize that their interests lie in collaborating and cooperating rather than constantly seeking an advantage over the others. It's a process, and we are nowhere near maturity....

Re: Innovation will continue
KB3001   8/2/2013 9:46:15 AM
Could not agree more. I say bring it on! Our over-reliance on transistor-level improvements over the last 30 years or so made us LAZY, but what's 30 years in the history of human progress? Nothing. We ought to look at other levels, including:

- Algorithmic: for many decades now, our way of thinking about problem solving has been biased towards von Neumann implementation platforms with semiconductor chips. Let's look beyond that and devise new algorithms for a wider range of platforms and paradigms.

- Architectural: hardware is not just about transistors; it's about computing and communication designs and architectures. I do not think we have adequately explored the realm of possibilities here; there is still a lot to be done.

- Physical: binary electronics using semiconductors is one of many possibilities for computing, storage, and communication. Here again, we have only scratched the surface.

To solve our computing, storage and communications needs, we must train a new breed of scientists and engineers. Out with the modularization, fragmentation and specialization of training and teaching, and in with holistic education.


Back to basics!



Reformulate it
seaEE   8/2/2013 1:02:02 AM
It may be that Moore's law needs to be reformulated, not in terms of transistors, microns, and nanometers, but purely in terms of the transfer of information.

I think that given advances in physics, materials, and new computing methods, Moore's Law, whatever it really is, will continue in some way, shape, or form.



Innovation will continue
vharihar   8/1/2013 11:43:37 PM
Let's not confuse the *means* with the *end*.

I believe that Moore's Law as *conventionally* stated will stop (transistors shrinking and the number of transistors doubling every N years).
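(As a quick aside, the conventional formulation mentioned above can be sketched as simple exponential growth; the starting count and doubling period below are purely illustrative, not figures from this thread.)

```python
def transistor_count(count_0: float, years: float, doubling_period: float = 2.0) -> float:
    """Projected transistor count after `years`, assuming a doubling
    every `doubling_period` years: count(t) = count_0 * 2**(t / N)."""
    return count_0 * 2 ** (years / doubling_period)

# Illustrative: starting from 1 billion transistors, 10 years of
# doubling every 2 years gives 2**5 = 32x the starting count.
print(transistor_count(1e9, 10))
```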

However, I feel that there will be innovations that will help continue forth with improvements in performance, power consumption and functionality (recall that these are the *end* objectives that we are really interested in. Scaling to smaller dimensions has just been the *means* of achieving this *end*).

And I feel that these innovations will enable using the same or maybe slightly *longer* channel lengths (say, 45 nm) than where we are currently headed (longer channel lengths imply better yields), and yet result in better performance and lower power. These innovations may take the form of using a different material than silicon, etc.


What's a law, anyway?
MeasurementBlues   8/1/2013 11:20:35 PM
Before you can declare Moore's Law dead, consider what it really is: a prediction. Then there's "law." Take Ohm's law. It seems to be irrefutable: V = IR. It seems to be universal, even in space. Then there are "laws" that governments pass. They can be revoked, just like Moore's "law." Laws passed by governments are really more rules than laws. If it can be revoked, it's not a law in the first place.
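(A trivial sketch of the contrast above: Ohm's law is just the fixed relation V = IR, with nothing to revoke; the values used here are illustrative.)

```python
def voltage(current_amps: float, resistance_ohms: float) -> float:
    """Voltage across a resistor per Ohm's law, V = I * R."""
    return current_amps * resistance_ohms

print(voltage(0.5, 100.0))  # 50.0 volts
```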

Re: I propose the opposite is true
Bert22306   8/1/2013 10:15:34 PM
Too many contradictions in these arguments, Tom. If Moore's Law continues to hold, it means that very soon everyone can afford these faster chips. Deep pockets or no. If Moore's Law stops or slows down, the faster technology will become expensive, available only to those with deep pockets. That seems to be what's happening already.

The effect on "national security" has to be that the richer a country is, the better off it is militarily *without* Moore's Law!!

Besides, I think that the term "national security threat" is being thrown about too loosely, used to justify way too many questionable things lately.
