The University of Texas at Austin’s Texas Advanced Computing Center will host and manage one of the world’s most powerful computers with a $59 million, five-year grant from the National Science Foundation (NSF), the largest single NSF grant in the university’s history.
The computer will significantly increase the computing power and time available to academic researchers around the country who conduct research on subjects ranging from the birth of the universe to the workings of molecules inside the body.
The NSF grant will pay for the acquisition of the computer and its operation.
The Texas Advanced Computing Center (TACC) is teaming with Sun Microsystems, Advanced Micro Devices Inc., the Cornell Theory Center at Cornell University and the Fulton High Performance Computing Institute at Arizona State University for the project.
The NSF awarded the grant to The University of Texas at Austin through the competitive High Performance Computing System Acquisition Program. The program is designed to deploy and support a world-class high performance computing system of unprecedented capacity and capability to empower the U.S. academic research community. The computer will be a part of the TeraGrid, an NSF-sponsored network of high performance computers.
“This is a very valuable resource for the scientific community and society in general,” said William Powers Jr., president of the university. “This award confirms that The University of Texas at Austin is an innovative leader in high performance computing and research.”
Scientists, engineers and other researchers from a variety of disciplines use high performance systems, also called supercomputers, to test and validate theories, analyze experiments and conduct experiments that would otherwise be impossible.
Juan Sanchez, vice president for research at the university, said the new supercomputer will enable a new wave of research and researchers.
“The Texas Advanced Computing Center is highly qualified to manage this powerful system, which will have a deep impact on science,” Sanchez said. “The scale of the hardware and its scientific potential will influence technology research and development in many areas, and the results and possibilities will contribute to increasing public awareness of high performance computing. In addition, the project team is deeply committed to training the next generation of researchers for using HPC resources.”
The computer is expected to run at a peak performance of more than 400 trillion floating point operations per second (that’s 400 followed by 12 zeroes) when it goes into full production in 2008. It is expected to be the most powerful general-purpose system for open research in the world.
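The figures above are easy to sanity-check: 400 trillion operations per second is 400 teraflops, or 0.4 petaflops. A throwaway illustration of the unit conversion (not project code):

```python
# Peak performance quoted in the announcement: 400 trillion
# floating point operations per second, i.e. 400 followed by
# 12 zeroes (400 teraflops).
peak_flops = 400 * 10**12

print(f"{peak_flops:,} flop/s")            # 400,000,000,000,000 flop/s
print(f"{peak_flops / 10**12:g} Tflop/s")  # 400 Tflop/s
print(f"{peak_flops / 10**15:g} Pflop/s")  # 0.4 Pflop/s
```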
“The new Sun system will provide unmatched capability and capacity for scientific discovery for the open research community,” said Jay Boisseau, director of TACC and the principal investigator for the project. “The technologies in the new Sun systems will enable unprecedented performance on important science problems.”
“With tremendous and balanced processor, memory, disk and interconnect capabilities, this powerful system will enable both numerically intensive and large-scale data applications in many scientific disciplines,” said Tommy Minyard, assistant director for advanced computational systems at TACC and the team project manager.
Under the agreement with the NSF, 5 percent of the computer’s processing time will be allocated to an industrial partners program for technology transfer and another 5 percent will be allocated to other Texas academic institutions.
“This resource will help Texas academic researchers provide answers to some of the most perplexing scientific questions,” said Mark G. Yudof, chancellor of the University of Texas System. “And it will give the industrial partners access to powerful tools to help them transform those answers into products and services that will benefit society and strengthen our economy.”
Working with the machine could help scientists make better predictions about the weather, hurricanes and earthquakes, as well as lead them closer to answers about supernovae, black holes and protein structures.
Dr. Omar Ghattas, director of the Center for Computational Geosciences at the Institute for Computational Engineering and Sciences at the university, said the power of the computer would help solve problems beyond the reach of current systems.
He is working to trace the effects of earthquakes back to their sources, which is called an inverse problem.
“The enormous leap in performance offered by this system will allow solution of inverse problems across many science and engineering areas for which forward simulation alone requires the full resources of current systems,” he said. “This will facilitate uncertainty quantification for large-scale simulations, which is crucial for forecasting, prediction and decision-making.”
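The inverse problems Ghattas describes share a common shape: a forward model predicts observations from a source, and the inversion searches for the source whose predictions best match the data. A deliberately tiny sketch of that idea (illustrative only, with a made-up decay model, not the earthquake codes discussed above):

```python
# Toy inverse problem: recover a source strength from observations,
# given a forward model. Real geoscience inversions follow the same
# pattern but with enormous simulations in place of this formula.

def forward(source_strength, distances):
    """Hypothetical forward model: signal decays as 1/distance^2."""
    return [source_strength / d**2 for d in distances]

# Synthetic "observed" data, generated with a known source strength.
distances = [1.0, 2.0, 3.0, 4.0]
observed = forward(5.0, distances)

def misfit(candidate):
    """Sum of squared differences between prediction and data."""
    predicted = forward(candidate, distances)
    return sum((p - o) ** 2 for p, o in zip(predicted, observed))

# Invert by brute-force search over candidate strengths 0.01..10.00;
# large-scale codes use gradient-based optimization instead.
best = min((misfit(c / 100), c / 100) for c in range(1, 1001))[1]
print(best)  # 5.0
```

The brute-force search here is exactly what becomes infeasible at scale: each misfit evaluation requires a full forward simulation, which is why inversions can consume the entire capacity of a system whose forward run alone is already expensive.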
Sun is building the computer using Sun Fire(TM) x64 (x86, 64-bit) servers powered by Next-Generation AMD Opteron(TM) processors. The system will be upgradeable to native Quad-Core AMD Opteron processors as they become available in mid-2007. Sun will also provide more than 100 trillion bytes (terabytes) of memory and 1.7 quadrillion bytes (petabytes) of storage with Sun StorageTek(TM) disk and tape storage technologies.