Petascale computers: the next supercomputing wave

By Liz Tay

The editor of the world’s first published collection on petascale techniques, David A. Bader, discusses petascale, exascale and the future of computing.

Supercomputing has come a long way in the past half-century. A far cry from CDC’s single-operation scalar processors of the 1960s, present-day terascale computers in development by companies like Intel boast up to 100 processor cores and the ability to perform one trillion operations per second.

Now, academics have turned their attention to petascale computers that are said to be capable of performing one quadrillion – that’s one million billion – operations per second. Running at nearly ten times the speed of today’s fastest supercomputers, petascale computing is expected to open the doors to solving global challenges such as environmental sustainability, disease prevention, and disaster recovery.
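As a rough back-of-the-envelope illustration (not from the article), the Python sketch below compares how long a fixed workload of one quadrillion operations would take at the single-core, terascale, and petascale rates mentioned above; the single-core figure of roughly 10^9 operations per second is an assumption for comparison.

```python
# Back-of-the-envelope comparison of compute scales (illustrative only).
# The terascale (10^12 ops/s) and petascale (10^15 ops/s) rates are the figures
# quoted in the article; the single-core rate of ~10^9 ops/s is an assumption.
RATES = {
    "single core (~10^9 ops/s)": 1e9,
    "terascale   (10^12 ops/s)": 1e12,
    "petascale   (10^15 ops/s)": 1e15,
}

WORKLOAD = 1e15  # one quadrillion operations

for name, rate in RATES.items():
    seconds = WORKLOAD / rate
    print(f"{name}: {seconds:>12,.0f} s  (~{seconds / 86400:.2f} days)")
```

At petascale the workload finishes in about a second; at terascale it takes roughly a quarter of an hour; on a single core it would run for well over a week.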

“Petascale Computing: Algorithms and Applications” was launched this month as the world’s first published collection on petascale techniques for computational science and engineering. Edited by David A. Bader, associate professor of computing and executive director of high-performance computing at Georgia Tech, the book is geared towards generating new discussion about the future of computing.

To gain a little insight into petascale technologies, iTnews spoke with Bader about high-performance computing and the future.

What is petascale computing?

Petascale computing is the current state of the art in high-performance computing: it leverages the most cutting-edge large-scale resources to solve grand challenge problems in science and engineering.

What will be the primary use of petascale computers?

Science has withstood centuries of challenges by building upon the community’s collective wisdom and knowledge through theory and experiment. However, in the past half-century, the research community has implicitly accepted a fundamental change to the scientific method.

In addition to theory and experiment, computation is often cited as the third pillar of scientific discovery.

Computational science enables us to investigate phenomena where economics or other constraints preclude experimentation, to evaluate complex models and manage massive data volumes, to model processes across interdisciplinary boundaries, and to transform business and engineering practices.

How significant will the development of petascale computing be to the advancement of science and technology?

Increasingly, cyber-infrastructure is required to address our national and global priorities: sustaining our natural environment by reducing our carbon footprint and decreasing our dependence on fossil fuels; improving human health and living conditions; understanding the mechanisms of life from molecules and systems to organisms and populations; preventing the spread of disease; predicting and tracking severe weather; recovering from natural and human-caused disasters; maintaining national security; and mastering nanotechnologies.

Several of our most fundamental intellectual questions also require computation, such as the formation of the universe, the evolution of life, and the properties of matter.

What are the challenges faced by petascale computing?

While petascale architectures will certainly be hailed as magnificent feats of engineering, the community anticipates an even harder challenge in scaling up algorithms and applications for these leadership-class supercomputing systems.

Several areas are important for this task: scalable algorithm design for massive concurrency, computational science and engineering applications, petascale tools, programming methodologies, performance analyses, and scientific visualisation.

High-end simulation is a tool for computational science and engineering applications. To be useful tools for science, such simulations must be based on accurate mathematical descriptions of the processes being modelled, and thus they begin with mathematical formulations such as partial differential equations, integral equations, graph-theoretic models, or combinatorial optimisation problems.
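As a toy illustration (a sketch of ours, not drawn from the book), the snippet below shows the path from one such formulation, the one-dimensional heat equation du/dt = alpha * d^2u/dx^2, to a simple explicit finite-difference simulation.

```python
import numpy as np

# Toy example: explicit finite-difference solution of the 1D heat equation
#   du/dt = alpha * d^2u/dx^2
# Schematic only: it shows how a PDE formulation becomes a time-stepping code.
alpha = 0.01               # diffusivity
nx, nt = 101, 500          # grid points, time steps
dx = 1.0 / (nx - 1)
dt = 0.4 * dx**2 / alpha   # step size chosen inside the explicit stability limit

u = np.zeros(nx)
u[nx // 2] = 1.0           # initial condition: a spike of heat at the midpoint

for _ in range(nt):
    # second-order central difference approximates the spatial derivative
    u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])

print(f"peak temperature after {nt} steps: {u.max():.4f}")
```

Scaling such a kernel from one dimension and a hundred grid points to three dimensions, billions of points, and coupled physics is where the algorithmic and software challenges described here arise.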

Because of the ever-growing complexity of scientific and engineering problems, computational needs continue to increase rapidly. But most of the currently available hardware, software, systems, and algorithms are primarily focused on business applications or smaller scale scientific and engineering problems, and cannot meet the high-end computing needs of cutting-edge scientific and engineering work.

This book [“Petascale Computing: Algorithms and Applications”] primarily addresses the concerns of petascale scientific applications, which are highly compute- and data-intensive, cannot be satisfied by today’s typical cluster environments, and tax even the largest available supercomputers.

When can we expect petascale computers, and who from?

Realising that cyber-infrastructure is essential to research innovation and competitiveness, several nations are now in [what has been called] a ‘new arms race to build the world’s mightiest computer’.

These petascale computers, expected around 2008 to 2012, will perform 10^15 (one quadrillion) operations per second, nearly an order of magnitude faster than today’s speediest supercomputer.

In fact, several nations are in a worldwide race to deliver high-performance computing systems that can achieve 10 petaflops or more within the next five years. Leading computing vendors, such as IBM, Cray, Dawning, NEC, and Sun Microsystems, are racing to deploy petascale computing systems.

How do you think technology will progress in 2008 and beyond? What is likely to be the next set of challenges we will face?

We expect to see the first peak petascale systems in 2008, and sustained petascale systems soon thereafter. Many grand challenge investigations in computational science and engineering will be solved by these magnificent systems.

Already, discussions are taking place on the complex applications that will require the next generations of supercomputers: exascale systems that are 1000 times more capable than these petascale systems.

Certainly amazing technological innovations and application development will be necessary to make exascale computing a reality.

Yours is the first book on petascale techniques ever to have been published. What do you hope it will achieve?

My goal in developing this book was to inspire members of the high-performance computing community to solve computational grand challenges that will help our society, protect our environment, and improve our understanding in fundamental ways, all through the efficient use of petascale computing.

This is the premier book capturing the first wave of applications anticipated to run on these petascale computers. International interest in this milestone book is already very high, since it is the first book ever to discuss applications of petascale computing.

This book on petascale computing will be a resource for training [future] generations of students and researchers in how to leverage state-of-the-art high-performance computing systems to solve grand challenges of national and global priority: sustaining our natural environment by reducing our carbon footprint and decreasing our dependence on fossil fuels; improving human health and living conditions; understanding the mechanisms of life from molecules and systems to organisms and populations; preventing the spread of disease; predicting and tracking severe weather; recovering from natural and human-caused disasters; maintaining national security; and mastering nanotechnologies.

https://www.itnews.com.au/feature/petascale-computers-the-next-supercomputing-wave-98316

David A. Bader
Distinguished Professor and Director of the Institute for Data Science

David A. Bader is a Distinguished Professor in the Department of Computer Science at New Jersey Institute of Technology.