While supercomputers are thought to help accelerate engineering and scientific discovery, the difficulty of programming them is increasingly becoming one of researchers' biggest productivity killers, according to a new study sponsored by Interactive Supercomputing, Inc.
The Development of Custom Parallel Computing Applications study conducted by the Simon Management Group surveyed more than 500 users of parallel high-performance computers (HPCs) from a range of industries including education, government, aerospace, healthcare, manufacturing, geo-sciences, bio-sciences and semiconductor. The report examines the software tools currently used, probes current application development environments, practices, and limitations, and catalogs critical issues and bottlenecks.
The study indicates that parallel code writing, programming efficiency, translation, debugging and limits of HPC software are the most frequently cited bottlenecks across all industries. Respondents indicated there is an urgent need to shorten the application development time of custom algorithms and models.
The largest category of respondents (42.3 percent) said that a typical project takes six months to complete, yet nearly 20 percent of respondents' projects consume two to three years of their time.
The majority of parallel application prototypes (65 percent) are developed in very high level languages (VHLLs) such as MATLAB, Mathematica, Python, and R. And while C and Fortran are frequently used to prototype, respondents overwhelmingly said they would prefer to work with an interactive desktop tool if the prototype could be easily bridged to work with HPC servers.
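The report does not include code, but the prototype-to-parallel pattern it describes can be illustrated with a minimal sketch in Python (one of the VHLLs the study names). This is a generic example using the standard multiprocessing module, not Star-P or any tool from the study; the `simulate` function is a hypothetical stand-in for an expensive per-sample computation in a desktop prototype.

```python
# A hedged sketch of parallelizing a desktop-style prototype: the same
# function that runs serially on a workstation is fanned out across
# worker processes with Python's standard library.
from multiprocessing import Pool

def simulate(seed):
    # Hypothetical stand-in for an expensive per-sample computation
    # (e.g. one Monte Carlo trial in an engineering model).
    x = seed
    for _ in range(100_000):
        x = (x * 1103515245 + 12345) % (2**31)
    return x % 100

if __name__ == "__main__":
    with Pool(4) as pool:  # distribute the samples across 4 worker processes
        results = pool.map(simulate, range(8))
    print(len(results))
```

The appeal the respondents describe is that the serial call `map(simulate, ...)` and the parallel `pool.map(simulate, ...)` look nearly identical; the pain they report comes when the real computation must be restructured to fit distributed memory and data movement on an HPC server, which a drop-in process pool does not capture.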
The disconnect stems from the fact that desktop computers cannot handle the processing and memory requirements of the huge amounts of data that many scientific and engineering problems analyze. And the problem is only getting worse. According to the study, the median data set used in a technical computing application today ranges from 10 to 45 gigabytes and is expected to swell to 200 to 600 gigabytes in just three years.
"This study demonstrates that programming tools have not kept pace with the advances in the computing hardware and affordability of high-performance computers," said Peter Simon, president of Simon Management Group. "Technical computing users would prefer to continue working with their favorite desktop tools while tapping into the computing muscle of parallel systems. But there is clearly pain involved with re-programming their desktop models to run on parallel systems."
A copy of the report can be downloaded at:
About Interactive Supercomputing
Interactive Supercomputing (ISC) launched in 2004 to commercialize Star-P, an interactive parallel computing platform. With automatic parallelization and interactive execution of existing desktop simulation applications, Star-P merges two previously distinct environments, desktop computers and high-performance servers, into one. Based in Waltham, Mass., the privately held company markets Star-P for a range of security, intelligence, manufacturing, energy, biomedical, financial, and scientific research applications. Additional information is available at http://www.interactivesupercomputing.com