MILPITAS, Calif. — While businesses turn to proven systems for their high-performance computing needs, research institutions are more willing to experiment and take a chance on the latest and greatest to solve complex problems.
The Japan Agency for Marine-Earth Science and Technology (JAMSTEC) is a perfect example: It recently selected SGI's large-scale shared memory system, the UV 2000, for installation at its Earth Simulator supercomputer center. The system incorporates Intel's newly launched Xeon processor E5-4600 v2. Bill Mannel, general manager for compute servers at SGI, explained how the new processor fits into SGI's UV 2000 architecture:
In Intel's "Tick/Tock" model, every "tock" microarchitectural change is followed by a "tick" die shrink, which increases transistor density to enable new capabilities, higher performance, and greater energy efficiency in a smaller, more capable version of the previous "tock" microarchitecture. The Intel Xeon E5-4600 v2 is a "tick" in Intel's innovation cycle, so there is no microarchitecture change (which typically brings extra cores) and no significant step-function in performance.
Mannel described the deployment as a "mileage may vary" situation: Some applications will benefit from the extra cores and get more work done; in other cases, processing power is not the bottleneck, so additional cores will not make a difference.
The SGI UV 2000 deployed at JAMSTEC is one of the largest shared memory systems of its kind: Powered by 2,560 Intel Xeon processor E5-4600 v2 cores, it provides 49.152 Tflops of computing capacity with 32TB of shared memory in a single system instance. The platform can scale up to 64 terabytes of memory and is built on industry-standard hardware and software. Configurations can start as small as four sockets and grow by adding blades while maintaining the right balance of compute, memory, and I/O networking or storage capability.
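The quoted 49.152 Tflops figure is consistent with a standard peak-performance calculation. As a back-of-the-envelope check (the per-core clock and flops-per-cycle values below are typical for Xeon E5-4600 v2 parts, assumed here rather than stated in the article):

```python
# Rough peak-Tflops check for the UV 2000 figures quoted above.
# Assumed values: 2.4 GHz base clock and 8 double-precision flops
# per cycle (AVX: 4 adds + 4 multiplies) are typical for this family,
# not confirmed by the article.
cores = 2560
clock_ghz = 2.4
flops_per_cycle = 8

peak_tflops = cores * clock_ghz * flops_per_cycle / 1000
print(peak_tflops)  # 49.152
```

Under those assumptions the arithmetic lands exactly on the 49.152 Tflops the system is quoted at, suggesting the figure is theoretical peak rather than measured sustained performance.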
Bob Braham, SGI's chief marketing officer, said JAMSTEC is an example of a customer that's less worried about total cost of ownership and more about pushing the limits of in-memory systems to solve problems as part of the knowledge discovery process. Although researchers aren't computer designers, they tend to give useful feedback, and they are willing to work on the bleeding edge since the applications aren't mission critical. On the other hand, enterprises expect these systems to have been shaken out by the time they deploy them.
JAMSTEC's Earth Simulator supercomputer will be used for earth science research that requires significant computational resources to run high-resolution and high-precision numerical simulations. Core research projects that will leverage the system include the Ocean Climate Change Research Program, Tropical Climate Variability Research Program, and Advanced Atmosphere-Ocean-Land Modeling Program. The agency also expects the specialized capability of the UV 2000 to attract use by industry scientists through strategic partnerships, including those from the automotive, pharmaceutical, and chemical industries.
High-performance computing used to be the sole domain of research, but enterprises now have their own big-data challenges as well as online transaction processing requirements that can take advantage of in-memory systems.
Steve Conway, IDC VP for high-performance computing and data analysis, said HPC problems can be split into two broad categories: those that can be partitioned and those that cannot. Business processes, such as running payroll operations, are easily partitioned. The ones that can't tend to be higher-value problems: "The processors and cores have to communicate with each other while the problem is being solved."
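The distinction Conway draws can be made concrete with a toy sketch (hypothetical code, not anything from JAMSTEC or IDC): a payroll-style job where every record is independent, versus a simulation step where every cell needs its neighbors' latest values, forcing processors to exchange data as the problem is solved.

```python
# Partitionable: each record is independent, so chunks can run on
# separate nodes with no communication ("embarrassingly parallel").
def payroll(records):
    return [r["hours"] * r["rate"] for r in records]

# Not partitionable: a 1-D diffusion step, where each interior cell
# depends on its neighbors. Split across nodes, the boundary values
# must be exchanged on every iteration.
def diffuse_step(t, alpha=0.1):
    return [t[i] + alpha * (t[i - 1] - 2 * t[i] + t[i + 1])
            if 0 < i < len(t) - 1 else t[i]
            for i in range(len(t))]

records = [{"hours": 40, "rate": 25.0}, {"hours": 35, "rate": 30.0}]
temps = [0.0, 100.0, 0.0, 0.0]
print(payroll(records))     # [1000.0, 1050.0]
print(diffuse_step(temps))  # [0.0, 80.0, 10.0, 0.0]
```

In the second case, a large shared memory space sidesteps the boundary-exchange problem entirely, which is the niche Conway describes SGI occupying.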
Having a single memory space improves performance and, more important to this problem-solving market segment, improves accuracy, said Conway. He noted that SGI has established itself as a provider of systems with large, single global memory spaces and fast communication between processors and memory.
He said scientific computing is much more risk tolerant than business computing, and these bleeding-edge technologies are adopted quickly before trickling into the enterprise space.
Braham said today's high-performance computing is ultimately tomorrow's enterprise computing and that SGI's UV 2000 has applications for businesses grappling with big data and supporting in-memory computing systems such as SAP HANA.