
Why Does the Nobel Prize Keep Forgetting Memory?

Re: Don't forget...
Kristin Lewotsky   7/3/2013 4:53:27 AM
In some ways, the bigger question is one of relevance. Nobel called out physics in his will most likely because it was considered the most prestigious and valuable field in his lifetime, during which major discoveries had come at a rapid pace. That is not to say that we've made all the major discoveries — obviously not — but in an increasingly technological society, the contributions of engineers have the potential to affect our world more profoundly. By establishing a separate honor for advanced technology, the Kyoto Prize program has made itself more relevant. Meanwhile, in the 1960s, the Nobel organization ruled that it would add no more categories, so unless the various committees shoehorn something in, the program will increasingly ignore key advances that are changing the fabric of our lives.

Re: Don't forget...
Kristin Lewotsky   7/3/2013 4:40:27 AM
True, but on the other hand, Geim and Novoselov won the 2010 Nobel Prize in Physics for isolating graphene, even though the material had been theorized about since 1947 and at least one other group had reported using the Scotch tape method as early as 1999. Granted, part of the Geim/Novoselov work had to do with characterization, but much of it was simply about isolating the material. Basically, they took the concept and turned it into reality. Conversely, the 2009 prize to Smith and Boyle for the CCD imager spawned some rather unbecoming controversy, because it was felt in some quarters that while the two may have developed the concept, others were responsible for extending it to imaging, as named in the prize. The language of Alfred Nobel's will says to award prizes to those who "...have conferred the greatest benefit to mankind...one part to the person who shall have made the most important discovery or invention within the field of physics..." Maybe Atanasoff made the discovery, but Dennard was responsible for the invention, including the "greatest benefit" part.

Don't forget...
Michael Dunn   7/2/2013 8:19:42 PM
Honestly, I question if a Nobel for DRAM isn't stretching things a bit. After all, the basic concept dates back to at least Atanasoff's machine from the late 1930s...
