Portland, Ore. -- Supercomputer simulations running at trillions of operations per second provide a virtual test tube for the nanoparticles that will form a new generation of magnetic media for tomorrow's hard-disk drives. To prove that point, researchers at the Pittsburgh Supercomputing Center and Oak Ridge National Laboratory recently modeled a promising nanoparticle material and uncovered a previously hidden property that the researchers say could enable densities of 1 terabit/square inch.
Today even the most densely packed magnetic media for disk drives, at 100 Gbits/square inch, derive their reliability from redundancy: each bit is stored across as many as 100 randomly shaped magnetic grains averaging 10 nanometers in size. Not all the grains are perfectly magnetized, but since the bit is stored as a statistical average among all grains, their nonuniformity can be managed to provide reliable performance.
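A back-of-envelope calculation (using the article's round numbers of 100 Gbits/square inch and 10-nm grains, and treating each grain as a circle in the plane of the medium) shows why roughly 100 grains fit under one bit:

```python
import math

# Illustrative check of the grains-per-bit figure quoted in the article.
# Assumed round numbers: 100 Gbit/sq. inch density, 10 nm average grain size.
density_bits_per_in2 = 100e9
nm_per_inch = 2.54e7
bit_area_nm2 = nm_per_inch**2 / density_bits_per_in2   # area available per bit

grain_diameter_nm = 10.0
grain_area_nm2 = math.pi * (grain_diameter_nm / 2) ** 2  # ~78.5 nm^2 per grain

grains_per_bit = bit_area_nm2 / grain_area_nm2
print(f"area per bit: {bit_area_nm2:.0f} nm^2")
print(f"grains per bit: {grains_per_bit:.0f}")   # on the order of 100
```

The estimate lands around 80 grains per bit, consistent with the article's "as many as 100" once irregular grain shapes and packing are accounted for.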
If that bit is shrunk to a single, 5-nm particle, however, how do quantum effects change its magnetic characteristics, density and longevity? Supercomputer simulations run at the Pittsburgh Supercomputing Center in cooperation with Oak Ridge are supplying the answers for specific nanoparticles and are thus revealing which materials show sufficient promise to warrant prototyping. Thus far, iron-platinum appears to have an edge because its self-isolating characteristics keep bits secure.
"Recently we modeled the magnetic moments of all 14,400 atoms in a nanoparticle of iron platinum with 1,200 processors, each assigned 12 atoms on our Cray XT3 supercomputer, which ran for 50 hours," said Yang Wang, a researcher at the Pittsburgh Supercomputing Center. "We found that this material has a 4-angstrom boundary region that isolates the nanoparticle from the surrounding alloy."
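The workload Wang describes is an even partition of the atoms across processors. A minimal sketch (not the actual LSMS code) of that block decomposition, using the article's figures of 14,400 atoms and 1,200 processors:

```python
# Illustrative sketch only: evenly partitioning the 14,400-atom nanoparticle
# model across 1,200 processors, as described in the article, gives each
# processor a local block of 12 atoms.
n_atoms = 14_400
n_procs = 1_200

atoms_per_proc, remainder = divmod(n_atoms, n_procs)
assert remainder == 0            # the decomposition is exact here
print(atoms_per_proc)            # 12

def local_atoms(rank, n_atoms=n_atoms, n_procs=n_procs):
    """Atom indices owned by processor `rank` under a block decomposition."""
    per = n_atoms // n_procs
    return range(rank * per, (rank + 1) * per)

print(list(local_atoms(0)))      # processor 0 owns atoms 0..11
```

In the real LSMS method each processor solves the local scattering problem for its own atoms, which is what makes this kind of flat decomposition scale so well.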
No one had previously predicted the boundary region, and its self-isolating characteristics have yet to be verified experimentally. Nevertheless, the researchers are confident that their simulation has revealed a boundary whose 4-angstrom thickness is constant, independent of nanoparticle size: the boundary measured the same whether the nanoparticles themselves were 5, 3.86 or 2.5 nm in diameter. That constancy makes the discovery all the more important, according to Malcolm Stocks, a researcher at Oak Ridge who supervised Wang when the latter was a postdoctoral assistant there.
"This knowledge will prove useful," said Stocks. "Like a fishbowl, the surface of the particles separates the interior from outside quantum effects."
The simulation software implements the locally self-consistent multiple-scattering (LSMS) method, which Wang, Stocks and others at Oak Ridge developed a decade ago and which won the 1998 Gordon Bell Prize as the first teraops scientific application. On the Cray XT3 supercomputer, the quantum effects simulator ran at over 8 teraops, more than 80 percent of the XT3's theoretical peak.
Wang is designing his models to answer questions about new nanoparticle formulations, including how much energy it takes to flip a bit's magnetic moment from 0 to 1, and how small and close together particles can be without compromising long-term reliability.
"Magnetic-media density is at about 100 Gbits/square inch, but the superparamagnetic limit is looming; eventually the grain size will get so small that thermal fluctuations will disturb the magnetic moments," said Wang. "Terabit densities will probably use patterned magnetic media and spintronic effects, making supercomputer simulations increasingly important."
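The superparamagnetic limit Wang refers to is conventionally expressed as the stability ratio K·V/(k_B·T), where K is the material's magnetic anisotropy constant and V the grain volume; once thermal energy approaches the anisotropy barrier, the moment flips spontaneously. A hedged, order-of-magnitude sketch (the anisotropy constants below are rough assumed values, not figures from the article) shows why a high-anisotropy material like iron-platinum matters at 5 nm:

```python
import math

# Illustrative estimate of the superparamagnetic stability ratio K*V/(k_B*T).
# The anisotropy constants used here are rough order-of-magnitude assumptions
# for illustration, not values from the article.
k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # room temperature, K

def stability_ratio(diameter_nm, K_J_per_m3):
    """K*V/(k_B*T) for a spherical grain; values of roughly 40-60 are
    commonly quoted as the threshold for ~10-year data retention."""
    r = diameter_nm * 1e-9 / 2
    V = (4.0 / 3.0) * math.pi * r**3
    return K_J_per_m3 * V / (k_B * T)

# A 5-nm grain of a high-anisotropy material (assumed K ~ 7e6 J/m^3,
# in the range often cited for ordered iron-platinum) stays well above
# the retention threshold:
print(f"{stability_ratio(5.0, 7e6):.0f}")

# The same 5-nm grain in a conventional low-anisotropy medium
# (assumed K ~ 2e5 J/m^3) falls far below it, i.e. it is superparamagnetic:
print(f"{stability_ratio(5.0, 2e5):.1f}")
```

Under these assumed numbers the high-anisotropy grain is thermally stable while the conventional one is not, which is the essence of the limit Wang describes.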
Previously, Wang and Stocks modeled the electronic and magnetic structure of an iron aluminide nanoparticle comprising 16,000 atoms. If all 2,048 processors on the Cray XT3 were utilized, Wang predicts, up to 100,000 atoms could be simulated, enabling future work to model not only single nanoparticles but also their interactions with adjacent bits.
According to Wang, simulating spintronic devices and patterned magnetic structures will require even faster supercomputers, since models will need to scale up to millions of atoms and therefore require 1 petaops (1,000 teraops) of supercomputer performance.
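A quick arithmetic check puts the article's performance targets in perspective: moving from the 8 teraops the LSMS code sustained on the XT3 to the 1 petaops Wang cites for million-atom spintronic models is more than a hundredfold jump.

```python
# Consistency check on the performance figures quoted in the article.
sustained_teraops_xt3 = 8.0      # LSMS sustained rate on the Cray XT3
petaops_target = 1_000.0         # 1 petaops = 1,000 teraops

speedup_needed = petaops_target / sustained_teraops_xt3
print(f"{speedup_needed:.0f}x")  # 125x
```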