Posted by Ken Farmer, Wednesday August 13 2008 @ 01:31PM EDT
TechNewsWorld: Supercomputers capable of carrying out enormous numbers of calculations per second have long been tied to some rather heady tasks -- playing humans in chess and modeling genomes, for instance. But as costs have fallen, businesses and government organizations have turned to supercomputers to solve all sorts of practical problems, from evaluating nuclear warheads without detonating them to designing the perfect potato chip.
Since the first supercomputers came online in the 1960s and '70s, they have earned a reputation as high-powered workhorses helping researchers conduct complex calculations.
Typically found at major universities and research facilities, the massive machines -- which at one time could occupy more than an acre of space in a data center -- were often used in science: quantum mechanical physics, molecular modeling or mapping the human genome. Some jobs were less esoteric: IBM's Deep Blue earned fame in the chess world as an opponent of grand master Garry Kasparov.