Will Faster Supercomputers Help Solve World’s Problems?

Slated for release as early as 2008, petascale supercomputers will be able to perform 1,000 trillion calculations per second, over twice the speed of today's most commonly used supercomputers.

With the power of 100,000 desktop computers, the newest generation of supercomputers is predicted to have a profound impact on business and industry. Stockbrokers will be able to use the added processing power to better predict swings in the stock market, and automakers will be able to use the higher speeds to shift more vehicle testing into simulation, catching most defects before a physical prototype is ever built.
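For a rough sense of scale, the "100,000 desktops" comparison can be checked with simple arithmetic. The sketch below assumes a typical 2008-era desktop sustains about 10 billion calculations per second; that figure is an assumption, not one given in the article.

    # Rough scale check for the "100,000 desktops" comparison.
    # Assumption (not from the article): a 2008-era desktop sustains
    # roughly 10 billion calculations per second.
    petascale = 10**15       # 1,000 trillion calculations per second
    desktop = 10 * 10**9     # assumed per-desktop throughput
    print(petascale // desktop)  # -> 100000 desktops per petascale machine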

Advancements in supercomputing are also projected to aid scientific discovery and speed up research. Experts maintain that as the technology advances, new systems will be able to analyze massive data sets and test complex scientific models far more quickly. Supercomputers are already used where experimentation is impossible, or too dangerous, costly or time-consuming.

In an interview with iTnews, David A. Bader, editor of the first book on petascale computing, said that scientists have high hopes for this next level of computing. He said that not only will the new machines make calculations easier, but they will be applied “to address our national and global priorities, such as sustainability of our natural environment by reducing our carbon footprint and by decreasing our dependencies on fossil fuels, improving human health and living conditions, understanding the mechanisms of life from molecules and systems to organisms and populations, preventing the spread of disease, predicting and tracking severe weather, recovering from natural and human-caused disasters, maintaining national security, and mastering nanotechnologies.”

Yet even as petascale computers promise to help solve mankind's persistent problems, experts have already set their sights on the next generation of supercomputers. Exascale computing, estimated to arrive in 2018, would be capable of running one million trillion calculations per second.
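Using only the two figures quoted in the article, the jump from petascale to exascale works out to a fixed factor, as the quick check below shows.

    # Exascale vs. petascale, using the figures quoted in the article.
    petascale = 10**15   # 1,000 trillion calculations per second
    exascale = 10**18    # one million trillion calculations per second
    print(exascale // petascale)  # -> 1000: exascale is a thousand-fold
                                  # jump over petascale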

https://rcg.org/realtruth/news/071213-002-science.html

David A. Bader
Distinguished Professor and Director of the Institute for Data Science

David A. Bader is a Distinguished Professor in the Department of Computer Science at New Jersey Institute of Technology.