PORTLAND, Ore. Cluster supercomputers have become a commodity that increasing numbers of engineers and scientists use remotely, but remote graphics tools have not kept pace.
To address the problem, the National Science Foundation (NSF) has awarded $7 million to the Texas Advanced Computing Center (TACC) to advance its "Longhorn" project to build a comprehensive suite of remotely accessed graphics visualization and data analysis tools.
Longhorn will have 256 nodes with 2,048 Intel Nehalem cores, plus 128 Nvidia Quadro Plex units for a total of 512 graphics processing units. Most nodes will each share 48 Gbytes of memory (6 Gbytes per core), while 12 high-density nodes will each house 144 Gbytes of shared memory.
According to Kelly Gaither, principal investigator and director of TACC's data and information analysis unit, "We are specifically targeting large-scale systems that have a need for graphics visualization and data analysis capabilities--from global weather prediction to cosmology."
The three-year project aims to deliver interactive remote graphics visualization and analysis tools capable of handling datasets on the petabyte scale (1 million gigabytes). Such giant datasets have become routine in a growing set of applications in science, engineering, medicine, national security and public safety, according to NSF's Office of Cyberinfrastructure.
Longhorn's design targets call for peak performance of 20.7 teraflops from its CPUs and 500 teraflops from its GPUs, with a total peak rendering rate of 154 billion triangles per second. The system will house 13.5 terabytes of semiconductor memory and 210 terabytes of disk space.
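The quoted aggregates can be sanity-checked against the per-node figures. The sketch below is a back-of-the-envelope calculation; the per-node breakdown (8 CPU cores per node, 2 GPUs per node on average) is inferred from the article's totals rather than independently confirmed.

```python
# Rough consistency check of Longhorn's quoted aggregate figures.
# Assumes 244 standard nodes at 48 GB and 12 high-density nodes at
# 144 GB, as implied by the article.

NODES = 256
TOTAL_CORES = 2048
TOTAL_GPUS = 512
HIGH_DENSITY_NODES = 12

cores_per_node = TOTAL_CORES // NODES   # 8 Nehalem cores per node
gpus_per_node = TOTAL_GPUS / NODES      # 2 GPUs per node on average

# Memory: most nodes share 48 GB (6 GB per core); 12 nodes carry 144 GB.
standard_nodes = NODES - HIGH_DENSITY_NODES
total_mem_gb = standard_nodes * 48 + HIGH_DENSITY_NODES * 144

print(cores_per_node)   # 8
print(gpus_per_node)    # 2.0
print(total_mem_gb)     # 13440
```

The memory total works out to 13,440 Gbytes, or roughly 13.4 terabytes, consistent with the article's rounded figure of 13.5 terabytes.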