HPCwire Live! Atlanta’s Big Data Kick Off Week Meets HPC: What does the future hold for HPC?

Join HPCwire Editor Nicole Hemsoth and Dr. David Bader from Georgia Tech as they take center stage on opening night at Atlanta’s first Big Data Kick Off Week, filmed in front of a live audience. Nicole and David look at the evolution of HPC and today’s big data challenges, discuss real-world solutions, and reveal their predictions. Exactly what does the future hold for HPC?


Speech Transcript

Moderator: I am really pleased to introduce our next speaker tonight, Nicole Hemsoth. She is the editor in chief of HPCwire, and for those of you who don’t know, HPCwire was started in 1986. It covers the fastest computers in the world and the people who run them. She is also the former managing editor of Datanami, whose theme is big data research and development and how it affects companies big and small. I would like to invite Nicole up here to talk to you.

Nicole Hemsoth (Founder and Sr. Contributing Editor, Datanami): Thanks a lot to the organizers for having me and for this event. This is very exciting; thanks also to the CDC for that presentation. I have a little bit of a confession to make. When I was invited here, it was to present on trends in big data, a sort of macro view of all the different applications, infrastructure systems, and whatnot. I went through and made a list of who the vendors are and such, and I realized how boring that is: first of all, you can get that information from any analyst firm, and secondly, the trend in big data, if there is any such thing, is really convergence. So I changed my entire presentation at the last minute; all of this was handwritten in the last two hours or so, partially because I met with a number of city leaders and others who talked about what big data means to the city and what it means to research institutions, like the projects you are working on at the high performance computing center. It’s too multilevel to offer one cohesive view.

What I do want to talk about is this trend of convergence, and I want to start at the very top, which is where I generally live: high performance computing, or supercomputing. Unless you are involved with it, supercomputing sounds like this esoteric, off-limits technology; who actually uses supercomputing? In fact, some of the most exciting data is coming from supercomputing, so to bring this topic of convergence together, I’ll invite Dr. Bader back up to talk more about data intensive computing.

Dr. Bader: So high performance computing, as Nicole said, used to be a niche of computing where we designed machines that would take up a space the size of this center. They would focus on niche problems, from tracking severe storms, to securing our nuclear weapons, to designing energy efficient and safer cars. Now all of that supercomputing has been shrunk down to machines that either fit in a cabinet, for example the size of your refrigerator at home, or into our laptops, where we have technology such as multicore CPUs and GPUs. What has this done? It has created an exciting age where we have supercomputers that we can harness on our desktops, and now, as you know, we have this data tsunami that we can process and mold together. So we’ve seen machines that were once supercomputers, not really accessible by the masses, now being geared toward these data intensive problems. I know that is why many of you are here in the audience. I think it is quite an exciting time to look at these data intensive problems and find out how these capabilities solve the real world challenges that we face day to day.

Nicole Hemsoth: Thank you very much, Dr. Bader. Dr. Bader is very well known in high performance computing. This year in June, you and I will be in Germany for the conference, where the fastest computer is expected to be the Titan supercomputer, which is clocking in at around 17 petaflops. You have this amazing amount of processing power, but processing power isn’t the whole story when it comes to data, so you have these giant machines designed for capacity, whose storage and memory have been optimized for big data. The convergence from supercomputing is trickling down to mainstream computing, but at the same time enterprise and web 2.0-driven technologies from companies like Facebook and Google, such as MapReduce, are finding their way up into supercomputing. So for the first time supercomputing isn’t telling all of technology what to do from the high end; there is a bottom-up democratization of technology in a totally new way, and it’s totally exciting.
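To make the MapReduce model Nicole mentions concrete, here is a minimal single-process sketch in Python; the word-count task, function names, and in-memory shuffle are illustrative assumptions, not Google’s or Hadoop’s actual implementation.

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def reduce_phase(pairs):
    """Shuffle + reduce: group values by key, then sum each group."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data meets hpc", "hpc meets big data"]
print(reduce_phase(map_phase(docs)))
# {'big': 2, 'data': 2, 'meets': 2, 'hpc': 2}
```

Because the map and reduce steps are independent per document and per key, a real framework can scatter them across thousands of commodity nodes, which is exactly the bottom-up technology the conversation describes moving into supercomputing.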

Dr. Bader: I totally agree, and we are seeing it at exhibits like the Supercomputing show, where it is not only countries trying to harness these computers but businesses both large and small trying to figure out how to keep our credit transactions safe, how to make a smarter planet, and how to go about our daily lives in a much more efficient way. It’s really a new set of problems that we have to solve.

Nicole Hemsoth: Again, the reason why I changed my presentation last night was listening to people who live in the computing world at different levels of the spectrum: people who work at Georgia Tech, people who work on high level cloud systems, massive servers, and hosted applications, and people who have smaller startups that do consumer services. One of the things that struck me was that Atlanta represents that pyramid, from the highest level of computer use, to that middle layer where Fortune 500 companies exist, to the bottom where startups use some of that new technology in mobile applications. Atlanta serves all ends of computing, where the highest levels and smallest levels of computing converge. I know that you are working on some really neat applications, many of which you cannot talk about. What do you think best represents a data intensive application, one that represents both of those worlds?

Dr. Bader: We do a lot of research here at Georgia Tech. Some of these projects support National Science Foundation endeavors in trying to understand the biology of the genome; other projects are geared toward trying to understand how social networks change over time. I think these types of applications are really a view into a new application space, and we have data sets available to sift through on the machines that are coming out of IBM and Cray and from microprocessor designers Intel and NVIDIA. All have new technologies that allow us to get new information from that data.
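As a rough illustration of the dynamic social network analysis Dr. Bader describes, here is a toy Python sketch that ingests a time-stamped edge stream and keeps a simple per-vertex metric current as the graph changes; the data, class, and metric are invented for illustration and do not represent Georgia Tech’s actual systems.

```python
from collections import defaultdict

class DynamicGraph:
    """Toy streaming graph: apply timestamped edge insertions and
    keep per-vertex degree counts up to date as the network evolves."""
    def __init__(self):
        self.adj = defaultdict(set)

    def add_edge(self, u, v):
        # Undirected edge; sets make repeated insertions idempotent.
        self.adj[u].add(v)
        self.adj[v].add(u)

    def degree(self, u):
        return len(self.adj[u])

# A tiny timestamped friendship stream: (time, user_a, user_b).
stream = [(1, "alice", "bob"), (2, "bob", "carol"), (3, "alice", "carol")]

g = DynamicGraph()
for t, u, v in stream:
    g.add_edge(u, v)
    print(f"t={t}: degree(alice) = {g.degree('alice')}")
```

The point of the streaming formulation is that each update touches only the affected vertices, rather than recomputing the whole graph, which is what makes analysis of continuously changing networks tractable at scale.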

Nicole Hemsoth: Absolutely. I imagine that there may be a few questions; we may take two or three. Keep them focused on the highest level of computing and where it is filtering down to servers, cloud service companies, and telcos. I am happy to field them.

That’s an interesting point. I believe that commodity, vanilla hardware is the wave of the future, and that is boring to an HPC person who geeks out over accelerators and GPUs, but we are moving toward these massive data centers that are basically commodity clusters able to handle a broad range of applications, because a lot of them are built on an open source hardware and software stack, an integrated stack that you can do anything with, from mobile to advanced life science applications to financial applications, all on the same hardware, hosted or not.

Dr. Bader: A little bit on future technology: the shift we are going to see in the upcoming years is toward systems that are more energy efficient, moving everything up close to the processor and further away from the disk, which gives us more energy efficiency. There are new memory technologies coming, so my view is to look at the next 3 to 5 years. Keep an eye on what is happening with the Hybrid Memory Cube consortium and other groups that are looking at how memory systems change, and I think that will greatly impact the big data solutions we see, as I said, 3 to 5 years from now.

Nicole Hemsoth: I think, to give a little bit of context on efficiency, this is the missing part of the conversation on big data. We just talked about the Cray Titan; we are talking about petascale, and the next step up is exascale. At this point, with current technology, even if you factor in 3D memory, you are looking at needing a nuclear power plant to run these data centers. Clearly something needs to change.
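For rough context on that power claim, here is some back-of-the-envelope arithmetic. Titan’s roughly 17.6 petaflops at about 8.2 megawatts are its published Top500 figures; holding efficiency constant while scaling to an exaflop is a simplifying assumption, not a projection of any real design.

```python
titan_pflops = 17.6   # Titan's Linpack performance, petaflops
titan_mw = 8.2        # Titan's reported power draw, megawatts
exa_pflops = 1000.0   # one exaflop = 1,000 petaflops

# Naive linear scaling: hold flops-per-watt constant at Titan's level.
exa_mw = titan_mw * exa_pflops / titan_pflops
print(f"Exascale at Titan's efficiency: ~{exa_mw:.0f} MW")
# ~466 MW, a sizable fraction of a large nuclear reactor's output.
```

This is why the memory and data movement work Dr. Bader mentions matters: exascale has to come from better flops per watt, not just more hardware.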

As for the commercialization and privatization of HPC, I could point to some awesome stuff if I had a Cray, but I do believe that it is a long way away. One of the cool parts of that is that if we all had our own cabinet in the kitchen or wherever we wanted to keep it, we could re-route the system so that it warms your home. There is a well-known study where people were using their own in-home data center to heat their home, so there are ways to work that in. The affordability is there: you can get an Intel Core i5 for an affordable price and put together your own research cluster in your home, or turn to Amazon and rent some high-end hardware in the cloud.

Dr. Bader: Maybe we have time for one more question.

Nicole Hemsoth: There are always debates about how you define HPC: do you define it by the applications, or by pure floating point performance? The way HPCwire as a magazine looks at it is: what is the purpose, and to what degree is performance critical there? For a lot of big data applications, latency is the bad thing, and ideal performance would be real time, but despite every vendor telling you it is real time, this is probably not the case. So there’s some merging of technology that needs to happen before the two come together.

Dr. Bader: It’s a moving target. What I would do is refer you to an analyst firm called IDC that tracks the market; there’s some movement as to what defines an HPC resource vs. a data resource, and I would point you there. When we talk about HPC, we are normally talking about bleeding-edge systems in computer technology, and if you look at HPC, what was a supercomputer one year becomes, if you wait 7 to 10 years, the laptop on your desk. Think about it another way: the computer you are using, maybe your iPod, would have been a supercomputer if you went back in time 10 years. So that’s the thought experiment: what can you do today with the capabilities you have on your desk?

Nicole Hemsoth: In terms of convergence, HPC and big data pushing together are building the foundation for the new layer of supercomputers that will eventually be the next generation. That is why building this from the ground up, where the top and bottom are working together to create this solid foundation technology, is so critical and such an exciting field. All of that aside, big data is exciting for what it means for everyone.

Dr. Bader: Thank you Nicole and thank you for listening to those presentations.


https://bit.ly/30EeycJ

David A. Bader
Distinguished Professor and Director of the Institute for Data Science

David A. Bader is a Distinguished Professor in the Department of Computer Science at New Jersey Institute of Technology.