Posts

Nvidia Analysts See Up To Five Times Return On $100 Billion OpenAI Deal. Is Nvidia A Buy Now?

By Vidya Ramakrishnan

Nvidia stock (NVDA) touched an all-time high in intraday trade Monday after the company said it would invest up to $100 billion in ChatGPT developer OpenAI. Nvidia shares fell to test their 50-day moving average on Wednesday. Is Nvidia stock a buy or sell now?

Nvidia Analysts See Up To Five Times Return On $100 Billion OpenAI Deal. Is Nvidia A Buy Now?
Harnessing the Power of LLMs for Software Performance Engineering

By David A. Bader

As someone who has spent decades optimizing parallel algorithms and wrestling with the complexities of high-performance computing, I’m constantly amazed by how the landscape of performance engineering continues to evolve. Today, I want to share an exciting development in our Fastcode initiative: leveraging Large Language Models (LLMs) as sophisticated analysis tools for software performance optimization.

Harnessing the Power of LLMs for Software Performance Engineering
AI breakthroughs spur race for superintelligence

By Swathi Moorthy, ETtech

Almost three years after the launch of OpenAI’s GPT-3.5 in November 2022, the techno-optimists would say that the world of artificial intelligence is closer to superintelligence than ever before.

AI breakthroughs spur race for superintelligence
David Bader Selected Among '35 Legends in the Class of 2025' by HPCwire

Written by: Michael Giorgio

Distinguished Professor David Bader in the Ying Wu College of Computing’s Department of Data Science has been recognized among an elite roster of High-Performance Computing (HPC) pioneers in the HPCwire 35 Legends Class of 2025 list. The honorees are selected annually based on contributions to the HPC community over the past 35 years that have been instrumental in improving the quality of life on our planet through technology.

David Bader Selected Among '35 Legends in the Class of 2025' by HPCwire
MIT Study Finds ChatGPT Can Harm Critical Thinking Over Time

By John P. Mello Jr.

A recent study by the Media Lab at MIT found that prolonged use of ChatGPT, a large language model (LLM) chatbot, can have a harmful impact on the cognitive abilities of its users.

MIT Study Finds ChatGPT Can Harm Critical Thinking Over Time
IBM’s New Quantum Roadmap Brings the Bitcoin Threat Closer

By Jason Nelson

Quantum computers weren’t expected to pose a threat to Bitcoin’s security anytime soon. But IBM has launched a project that could expedite the timeline: the world’s first fault-tolerant quantum computer, set to debut by 2029.

IBM’s New Quantum Roadmap Brings the Bitcoin Threat Closer
A Billionaire Emerges As Key Nvidia Partner, With A New Lab Based In NJ

By Elizabeth MacBride, Senior Contributor, Business Journalist

Thai Lee speaks at the recent opening of the AI Labs in New Jersey. SHI

Nvidia is opening another distribution strategy for its GPU capacity, by establishing a network of IT consulting firms as partners to test and sell AI solutions. The partners use new Spark machines and sell Nvidia infrastructure to enterprises.

A Billionaire Emerges As Key Nvidia Partner, With A New Lab Based In NJ
Bader receives the 2025 Heatherington Award for Technological Innovation
Congratulations to the 2025 Hall of Fame inductees at Mimms Museum of Technology and Art (formerly Computer Museum of America). Executive Director, Rena Youngblood had the honor of presenting each person with their award at the BYTE25 annual fundraiser held Thursday, March 6th. Thank you David A. Bader, PhD, Dan Bricklin, and John Yates for joining us. This world would be a different place without you—your innovations, your willingness to collaborate, and your commitment to paying it forward.
Bader receives the Heatherington Award for Technological Innovation at the Mimms Museum for Technology and Art, 6 March 2025.

Revolutionized the computing industry through groundbreaking innovations that democratized High-Performance Computing. Designed the first commodity-based supercomputer and prototyped a system using off-the-shelf components and a novel high-speed interconnection network, leading to “RoadRunner,” the first Linux supercomputer for open use by the national science and engineering community. Made seminal contributions to parallel computing software and pioneered general-purpose computing on accelerators. Led the team whose work was used by IBM in the first pre-assembled and configured Linux server clusters for business. Made significant research contributions in the field of novel parallel graph algorithms. Founded and chaired the School of Computational Science and Engineering at Georgia Tech.

Bader receives the 2025 Heatherington Award for Technological Innovation
Computer Museum of America Announces 2025 Hall of Fame Inductees
David Bader at the Hall of Fame ceremony, 6 March 2025.

ROSWELL, Ga. – (February 4, 2025) – Computer Museum of America (CMoA), a metro Atlanta attraction featuring one of the world’s largest collections of digital-age artifacts, today announced the induction of three new distinct members to its Hall of Fame. The trailblazing pioneers will be honored on March 6 during BYTE25, the museum’s largest fundraiser of the year.

Computer Museum of America Announces 2025 Hall of Fame Inductees
DeepSeek has called into question Big AI’s trillion-dollar assumption
[Photo: Patrick Pleul/picture alliance via Getty Images]

By Mark Sullivan

Recently, Chinese startup DeepSeek created state-of-the-art AI models using far less computing power and capital than anyone thought possible. It then showed its work in published research papers and by allowing its models to explain the reasoning process that led to this answer or that. It also scored at or near the top in a range of benchmark tests, besting OpenAI models in several skill areas. The surprising work seems to have let some of the air out of the AI industry’s main assumption—that the best way to make models smarter is by giving them more computing power, so that the AI lab with the most Nvidia chips will have the best models and shortest route to artificial general intelligence (AGI—which refers to AI that’s better than humans at most tasks).

DeepSeek has called into question Big AI’s trillion-dollar assumption
7 Questions for David Bader: Graph Analytics at Scale with Arkouda and Chapel

By: Engin Kayraklioglu, Brad Chamberlain

In this installment of our 7 Questions for Chapel Users series, we welcome David Bader, a Distinguished Professor in the Ying Wu College of Computing at the New Jersey Institute of Technology (NJIT). With a deep focus on high-performance computing and data science, David has consistently driven innovation in solving some of the most complex and large-scale computational problems. Read on to dive into his journey with Chapel, his current projects, and how tools like Arkouda and Arachne are accelerating data science at scale.

7 Questions for David Bader: Graph Analytics at Scale with Arkouda and Chapel
Amid an A.I. Chip Shortage, the GPU Rental Market Is Booming

By Aaron Mok

GPU rentals allow small companies to access high-performance A.I. chips for specific projects. Igor Omilaev/Unsplash

GPUs, or graphics processing units, have become increasingly difficult to acquire as tech giants like OpenAI and Meta purchase mountains of them to power A.I. models. Amid an ongoing chip shortage, a crop of startups is stepping up to increase access to the highly sought-after A.I. chips—by renting them out.

Amid an A.I. Chip Shortage, the GPU Rental Market Is Booming
Big Tech Goes Nuclear To Quench AI Energy Demands

By Jon Swartz

Big Tech is going nuclear in an escalating race to meet growing energy demands.

Amazon.com Inc., Alphabet Inc.’s Google and Microsoft Corp. are pouring billions of dollars into nuclear energy facilities to supply companies with emissions-free electricity to feed their artificial intelligence services.

Big Tech Goes Nuclear To Quench AI Energy Demands
Devastation from Hurricane Helene could bring semiconductor chipmaking to a halt

By Clare Duffy and Dianne Gallagher, CNN

Aftermath of Hurricane Helene seen near Spruce Pine, North Carolina, which supplies much of the world’s high-purity quartz for semiconductor manufacturing. Courtesy Dr. Barbara A. Stagg

New York, CNN — The devastation in North Carolina in the wake of Hurricane Helene could have serious implications for a niche — but extremely important — corner of the tech industry.

Devastation from Hurricane Helene could bring semiconductor chipmaking to a halt
Why did Delta take days to restore normal service after CrowdStrike outage? Experts weigh in.

By Max Zahn

An outage caused by a software update distributed by cybersecurity firm CrowdStrike triggered a wave of flight cancellations at several major U.S. airlines – but the disruption was most severe and prolonged at Delta Airlines.

Why did Delta take days to restore normal service after CrowdStrike outage? Experts weigh in.
Here's what the CrowdStrike outage exposed about our connected world. It's not good.

Nearly a week after a massive IT outage shut down computer systems around the world, cybersecurity company CrowdStrike (CRWD) issued a statement Thursday revealing that a single software update was responsible for grounding planes, curtailing hospital procedures, and closing businesses for days.

Here's what the CrowdStrike outage exposed about our connected world. It's not good.
Ticketmaster hacked: Here's how experts say New Jerseyans can protect their info

by Daniel Munoz, NorthJersey.com

More than half a billion customers at the event ticketing company Ticketmaster had their personal information hacked and potentially sold on the dark web last month, but security experts say there are measures you can take to minimize the impact.

Ticketmaster hacked: Here's how experts say New Jerseyans can protect their info
The Taiwan earthquake is a stark reminder of the risks to the region’s chipmaking industry

By Clare Duffy, CNN

Emergency personnel stand in front of a partially collapsed building leaning over a street in Hualien on April 3, 2024, after a major earthquake hit Taiwan’s east coast. Sam Yeh/AFP/Getty Images

New York (CNN) – The world’s biggest chipmaker is working to resume operations following the massive earthquake that struck Taiwan Wednesday — a welcome sign for makers of products ranging from iPhones and computers to cars and washing machines that rely on advanced semiconductors.

The Taiwan earthquake is a stark reminder of the risks to the region’s chipmaking industry
To The Point Cybersecurity Podcast: The Democratization of Data Science Tools with Dr. David Bader

Listen

About This Episode

Joining us this week is Dr. David Bader, a Distinguished Professor and founder of the Department of Data Science in the Ying Wu College of Computing and Director of the Institute for Data Science at the New Jersey Institute of Technology. He deep dives into the opportunity to democratize data science tools and the awesome free tool he and Mike Merrill spent the last several years building that can be found on the Bears-R-Us GitHub page open to the public.

To The Point Cybersecurity Podcast: The Democratization of Data Science Tools with Dr. David Bader
What CISOs need to know to mitigate quantum computing risks

By David Bader

Image via Unsplash

Quantum technologies harness the laws of quantum mechanics to solve complex problems beyond the capabilities of classical computers. Although quantum computing can one day lead to positive and transformative solutions for complex global issues, the development of these technologies also poses a significant and emerging threat to cybersecurity infrastructure for organizations.

What CISOs need to know to mitigate quantum computing risks
This New AI Brain Decoder Could Be A Privacy Nightmare, Experts Say

By Sascha Brodsky

Artificial intelligence (AI) lets researchers interpret human thoughts, and the technology is sparking privacy concerns. The new system can translate a person’s brain activity while listening to a story into a continuous stream of text. It’s meant to help people who can’t speak, such as those debilitated by strokes, to communicate. But there are concerns that the same techniques could one day be used to invade thoughts.

This New AI Brain Decoder Could Be A Privacy Nightmare, Experts Say
Should organizations swear off open-source software altogether?

Written by Tim Keary

Image Credit: Shutterstock

Open-source software is a nightmare for data security. According to Synopsys, while 96% of software programs contain some kind of open-source software component, 84% of codebases contain at least one vulnerability.

Should organizations swear off open-source software altogether?
Will computing advances mean the end of digital privacy?

By: Martin Daks

An emerging technology harnesses the laws of quantum mechanics to solve problems too complex for classical computers. Quantum computing is expected to shatter barriers and turbocharge processes, from drug discovery to financial portfolio management. But this revolutionary new approach may also give hackers the ability to crack open just about any kind of digital “safe,” giving them access to trade secrets, sensitive communications and other mission-critical data. Last year, the threat prompted President Joe Biden to sign a national security memorandum, “Promoting United States Leadership in Quantum Computing While Mitigating Risks to Vulnerable Cryptographic Systems,” directing federal agencies to migrate vulnerable cryptographic systems to quantum-resistant cryptography. We spoke with some cybersecurity experts to find out what’s ahead.

Will computing advances mean the end of digital privacy?
‘Weaponised app’: Is Egypt spying on COP27 delegates’ phones?

By Beatrice Zemelyte

Cybersecurity concerns have been raised at the United Nations’ COP27 climate talks over an official smartphone app that reportedly has carte blanche to monitor locations, private conversations and photographs.

‘Weaponised app’: Is Egypt spying on COP27 delegates’ phones?
ECE Launches 14th Year of Booz Allen Hamilton Colloquium Series

The Fall 2022 Booz Allen Hamilton Colloquium Series kicked off its 14th year last Friday. The first talk of the semester was given by Dr. Tim O’Shea, Chief Technology Officer of DeepSig, and hosted by Professor Sennur Ulukus. O’Shea’s talk was titled “Deep Learning in the Physical Layer: Building AI-Native Sensing and Communications Systems”.

ECE Launches 14th Year of Booz Allen Hamilton Colloquium Series
Data Engineering Podcast: Interactive Exploratory Data Analysis On Petabyte Scale Data Sets With Arkouda

Listen

Exploratory data analysis works best when the feedback loop is fast and iterative. This is easy to achieve when you are working on small datasets, but as they scale up beyond what can fit on a single machine those short iterations quickly become long and tedious. The Arkouda project is a Python interface built on top of the Chapel compiler to bring back those interactive speeds for exploratory analysis on horizontally scalable compute that parallelizes operations on large volumes of data. In this episode David Bader explains how the framework operates, the algorithms that are built into it to support complex analyses, and how you can start using it today.
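
For readers who want a feel for the interactive workflow described above, here is a minimal sketch of an Arkouda session, assuming an arkouda_server instance is already running; the host, port, and array size below are placeholders, not details from the episode.

    import arkouda as ak

    # Attach the Python client to a running Chapel-based arkouda_server
    # (the host and port are placeholders for your own deployment).
    ak.connect("localhost", 5555)

    # The array is created and held on the server, where operations run in
    # parallel; only small results travel back to the Python session.
    a = ak.randint(0, 2**32, 10**8)
    perm = ak.argsort(a)        # server-side parallel sort
    print(a[perm][:10])         # inspect a few values interactively

    ak.disconnect()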

Data Engineering Podcast: Interactive Exploratory Data Analysis On Petabyte Scale Data Sets With Arkouda
Hyperion Study Tracks Rise and Impact of Linux Supercomputers

By John Russell

That supercomputers produce impactful, lasting value is a basic tenet among the HPC community. To make the point more formally, Hyperion Research has issued a new report, The Economic and Societal Benefits of Linux Supercomputers. Inclusion of Linux is fundamental here. The powerful, open source operating system was embraced early by the HPC world and helped spawn a huge HPC application ecosystem that makes these systems so broadly useful.

Hyperion Study Tracks Rise and Impact of Linux Supercomputers
Random but Memorable Podcast: Minority Report Super Computer with David Bader

Listen

EPISODE SUMMARY

This week we discover the real-world capabilities of supercomputers in cybersecurity and how data analysis can uncover insider threats with Distinguished Professor David Bader. We also wind back the clock and look at how far computing has come, from David’s work building the first ever Linux supercomputer to the revolutionary chip inside the PlayStation 3.

Random but Memorable Podcast: Minority Report Super Computer with David Bader
ITSPmagazine Podcast: Large-Scale Data Analytics For Cybersecurity And Solving Real-World Grand Challenges

Listen

EPISODE SUMMARY

We may see new “graph” processors in the future that can better handle the data-centric computations in data science. Will that be enough?

ITSPmagazine Podcast: Large-Scale Data Analytics For Cybersecurity And Solving Real-World Grand Challenges
David Bader Named ACM Fellow for Career in Data Science and Supercomputing

Written by: Evan Koblentz

The ACM this week announced NJIT Distinguished Professor David Bader in its new class of Fellows.

Bader received the honor along with 70 peers in the computing industry, representing just 1 percent of Association for Computing Machinery membership.

David Bader Named ACM Fellow for Career in Data Science and Supercomputing
IT Visionaries: Academics and Data Science Innovation with Dr. David Bader, Distinguished Professor and Director, Institute for Data Science, New Jersey Institute of Technology

Podcast:

The data science field is expanding because so many businesses and other institutions require skilled workers who can manage data as well as provide insights. Companies and students are clamoring for more academic programs. There is great need, but academic institutions are still transitioning to meet the demand. Dr. David Bader, Distinguished Professor and Director of the Institute for Data Science at the New Jersey Institute of Technology, explains how his school is leading the charge to create opportunities for more students to study data science.

IT Visionaries: Academics and Data Science Innovation with Dr. David Bader, Distinguished Professor and Director, Institute for Data Science, New Jersey Institute of Technology
SC21 General Chair Bronis R. de Supinski Recaps the First-Ever Hybrid SC Conference

by Bronis R. de Supinski

SC21 Marks a Number of Firsts

It was the first time we hosted SC in St. Louis, where the famous Gateway Arch rises over the banks of the Mississippi River. It was the first time we delved into how the power of HPC is expanding to other disciplines, beyond traditional science and to areas such as arts and humanities. The SC21 theme, “Science & Beyond,” represents that expansion and was highlighted in many ways, including in the SC21 Keynote by Dr. Vint Cerf, one of the “fathers of the internet,” who shared on-stage how advanced computing has a groundbreaking effect on how we can better appreciate and understand the study of languages and literatures, the arts, history and philosophy.

SC21 General Chair Bronis R. de Supinski Recaps the First-Ever Hybrid SC Conference
A. James Clark School of Engineering, Silver Terps Reunion, Celebrating the Classes of 1996 to 1981

David Bader
Ph.D. 1996 Electrical and Computer Engineering

David A. Bader is a Distinguished Professor and founder of the Department of Data Science and inaugural Director of the Institute for Data Science at New Jersey Institute of Technology. Prior to this, he served as founding Professor and Chair of the School of Computational Science and Engineering at Georgia Institute of Technology. Dr. Bader is a Fellow of the IEEE, AAAS, and SIAM, and a recipient of the IEEE Sidney Fernbach Award. He advises the White House, most recently on the National Strategic Computing Initiative and Future Advanced Computing Ecosystem. Bader is a leading expert in solving global grand challenges in science, engineering, computing, and data science. His interests are at the intersection of high-performance computing and real-world applications, including cybersecurity, massive-scale analytics, and computational genomics. Dr. Bader is Editor-in-Chief of the ACM Transactions on Parallel Computing and previously served as Editor-in-Chief of the IEEE Transactions on Parallel and Distributed Systems. In 2021, ROI-NJ recognized Bader on its inaugural list of technology influencers, and in 2012, Bader was the inaugural recipient of University of Maryland’s Electrical and Computer Engineering Distinguished Alumni Award. In 1998, Bader built the first Linux supercomputer that led to a high-performance computing (HPC) revolution.

A. James Clark School of Engineering, Silver Terps Reunion, Celebrating the Classes of 1996 to 1981
Alumnus David Bader Receives 2021 Sidney Fernbach Award
Alumnus David Bader (Ph.D., ’96)

Electrical and Computer Engineering Alumnus David Bader (Ph.D., ’96) is the recipient of the 2021 Sidney Fernbach Award from the IEEE Computer Society (IEEE CS). Bader is a Distinguished Professor and founder of the Department of Data Science, and inaugural Director of the Institute for Data Science, at the New Jersey Institute of Technology.

Alumnus David Bader Receives 2021 Sidney Fernbach Award
David Bader Selected for 2021 IEEE Computer Society Sidney Fernbach Award

LOS ALAMITOS, Calif., 22 September 2021 – The IEEE Computer Society (IEEE CS) has named David Bader as the recipient of the 2021 Sidney Fernbach Award. Bader is a Distinguished Professor and founder of the Department of Data Science, and inaugural Director of the Institute for Data Science, at the New Jersey Institute of Technology.

David Bader Selected for 2021 IEEE Computer Society Sidney Fernbach Award
David Bader Selected to Receive the 2021 IEEE Computer Society Sidney Fernbach Award

LOS ALAMITOS, Calif., 22 September 2021 – The IEEE Computer Society (IEEE CS) has named David Bader as the recipient of the 2021 Sidney Fernbach Award. Bader is a Distinguished Professor and founder of the Department of Data Science, and inaugural Director of the Institute for Data Science, at the New Jersey Institute of Technology.

David Bader Selected to Receive the 2021 IEEE Computer Society Sidney Fernbach Award
NJIT's David Bader Selected to Receive the 2021 IEEE Computer Society Sidney Fernbach Award

The IEEE Computer Society (IEEE CS) has named David Bader as the recipient of the 2021 Sidney Fernbach Award. Bader is a Distinguished Professor and founder of the Department of Data Science, and inaugural Director of the Institute for Data Science, at the New Jersey Institute of Technology.

NJIT's David Bader Selected to Receive the 2021 IEEE Computer Society Sidney Fernbach Award
TED Talk At NJIT: Impact Of Tech In A Resurgent, Post-COVID World

By Eric Kiefer, Patch Staff

NEWARK, NJ — The following news release comes courtesy of NJIT.

Speakers at TEDxNJIT 2021 will explain how technology impacts everything from knee-replacement surgery and the monitoring of traumatic brain injuries to how we’ll live in the wake of the global pandemic.

TED Talk At NJIT: Impact Of Tech In A Resurgent, Post-COVID World
“Bad” Voting Machine Rejected by New York

New York City – February 2, 2021 – The New York State Board of Elections unanimously rejected certification of a voting machine called the ExpressVote XL at a special late January meeting. The machine, made by ES&S, is referred to as a “hybrid” or “all-in-one” voting machine because it combines voting and tabulation in a single device. Rather than tabulating hand-marked paper ballots, the practice recommended by security experts, the ExpressVote XL generates a computer-printed summary card for each voter. The summary cards contain barcodes representing candidates’ names, and the machine tabulates votes from the barcodes. Security experts warn that the system “could change a vote for one candidate to be a vote for another candidate” if it were hacked. Colorado, a leader in election security, has banned barcodes in voting, due to the high risk.

“Bad” Voting Machine Rejected by New York
Technology VIPs, Including an Internet Pioneer, Visit NJIT for Inspiration

Written by: Evan Koblentz

Leonard Kleinrock, who devised the mathematical model for packet switching in 1962.

TTI/Vanguard, a prestigious organization of technology industry executives who meet a few times each year to study and debate emerging innovations, chose to virtually visit New Jersey Institute of Technology this week for their latest intellectual retreat.

Technology VIPs, Including an Internet Pioneer, Visit NJIT for Inspiration
1st Algorithmic Breakthrough in 40 years for solving the Minimum Spanning Tree (MST) Replacement Edges problem

One of the most studied problems in computer science is called “Minimum Spanning Tree,” or MST. In this problem, one is given a graph made up of vertices and weighted edges, and asked to find a subset of edges that connects all of the vertices such that the total sum of their weights is as small as possible. Many real-world optimization problems are solved by finding a minimum spanning tree, such as the lowest-cost layout for distribution over road networks, where intersections are vertices and weights could be the length of a road segment or the time to drive it. In 1926, Czech scientist Otakar Borůvka was the first to design an MST algorithm. Other famous MST algorithms are named for the scientists who designed them in the late 1950s, such as Prim, Kruskal, and Dijkstra.
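
To make the problem concrete, here is a minimal Kruskal-style sketch in Python, one of the classic approaches the post names, using a simple union-find structure; the example graph and weights are made up for illustration.

    def minimum_spanning_tree(num_vertices, edges):
        """Kruskal's algorithm: edges are (weight, u, v) tuples."""
        parent = list(range(num_vertices))

        def find(x):                      # root of x's component, with path halving
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        mst, total = [], 0
        for w, u, v in sorted(edges):     # consider edges from lightest to heaviest
            ru, rv = find(u), find(v)
            if ru != rv:                  # keep the edge only if it joins two components
                parent[ru] = rv
                mst.append((u, v, w))
                total += w
        return mst, total

    edges = [(4, 0, 1), (1, 1, 2), (3, 0, 2), (2, 2, 3)]
    print(minimum_spanning_tree(4, edges))
    # ([(1, 2, 1), (2, 3, 2), (0, 2, 3)], 6)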

Shawn Cicoria, M.S. in Data Science, NJIT@JerseyCity

Shawn Cicoria, Principal Software Engineer Manager, Microsoft, discusses the M.S. in Data Science program at NJIT@JerseyCity, highlighting Prof. David A. Bader.

https://www.youtube.com/watch?v=xQt5GSgwk8k

Shawn Cicoria, M.S. in Data Science, NJIT@JerseyCity
NJIT Professor Receives Facebook Research Award for Data Science

The director of NJIT’s new Institute for Data Science has received an award from Facebook to support real-world analytics research. The research aims to develop faster learning patterns to make it easier for companies to extract actionable information from extremely large data sets.

NJIT Professor Receives Facebook Research Award for Data Science
NJIT to Establish New Institute for Data Science

Continuing its mission to lead in computing technologies, NJIT announced today that it will establish a new Institute for Data Science, focusing on cutting-edge interdisciplinary research and development in all areas pertinent to digital data. The institute will bring existing research centers in big data, medical informatics and cybersecurity together with new research centers in data analytics and artificial intelligence, cutting across all NJIT colleges and schools, and conduct both basic and applied research.

NJIT to Establish New Institute for Data Science
Future Computing Community of Interest Meeting

On August 5-6, 2019, I was invited to attend the Future Computing (FC) Community of Interest Meeting sponsored by the National Coordination Office (NCO) of NITRD. The Networking and Information Technology Research and Development (NITRD) Program is a formal Federal program that coordinates the activities of 23 member agencies to tackle multidisciplinary, multitechnology, and multisector cyberinfrastructure R&D needs of the Federal Government and the Nation. The meeting was held in Washington, DC, at the NITRD NCO office.

Future Computing Community of Interest Meeting
David Bader to Lead New Institute for Data Science at NJIT

Professor David Bader will lead the new Institute for Data Science at the New Jersey Institute of Technology. Focused on cutting-edge interdisciplinary research and development in all areas pertinent to digital data, the institute will bring existing research centers in big data, medical informatics and cybersecurity together to conduct both basic and applied research.

David Bader to Lead New Institute for Data Science at NJIT
Facebook Research: Announcing the winners of the AI System Hardware/Software Co-Design research awards

In January, Facebook invited university faculty to respond to a call for research proposals on AI System Hardware/Software Co-Design. Co-design implies simultaneous design and optimization of several aspects of the system, including hardware and software, to achieve a set target for a given system metric, such as throughput, latency, power, size, or any combination thereof. Deep learning has been particularly amenable to such co-design processes across various parts of the software and hardware stack, leading to a variety of novel algorithms, numerical optimizations, and AI hardware.

Facebook Research: Announcing the winners of the AI System Hardware/Software Co-Design research awards
NVIDIA AI Laboratory (NVAIL)

Georgia Tech, UC Davis, Texas A&M Join NVAIL Program with Focus on Graph Analytics

By Sandra Skaff

NVIDIA is partnering with three leading universities — Georgia Tech, the University of California, Davis, and Texas A&M — as part of our NVIDIA AI Labs program, to build the future of graph analytics on GPUs.

NVIDIA AI Laboratory (NVAIL)
Chronicle of Higher Education: Gazette

The Society for Industrial and Applied Mathematics selected 28 fellows for 2019 in recognition of their research and service to the community.

David A. Bader, a professor and chair of computational science and engineering at the Georgia Institute of Technology, for contributions in high-performance algorithms and streaming analytics and for leadership in the field of computational science.

Chronicle of Higher Education: Gazette
SIAM Announces Class of 2019 Fellows

SIAM Recognizes Distinguished Work through Fellows Program

Society for Industrial and Applied Mathematics (SIAM) is pleased to announce the 2019 Class of SIAM Fellows. These distinguished members were nominated for their exemplary research as well as outstanding service to the community. Through their contributions, SIAM Fellows help advance the fields of applied mathematics and computational science.

SIAM Announces Class of 2019 Fellows
Solving Real-World Problems: 5-Minute Interview with David Bader

Solving Real-World Problems: 5-Minute Interview with David Bader, Professor at Georgia Tech

When David Bader started working with graphs 25 years ago, it was a niche that required designing specific algorithms and even specific computers. Now the Neo4j graph database is used widely by analysts and researchers who work with Georgia Tech, rapidly asking questions and visualizing results.

Solving Real-World Problems: 5-Minute Interview with David Bader
CSE Chair David Bader Named Editor-in-Chief of ACM Transactions on Parallel Computing

School of Computational Science and Engineering Chair and Professor David Bader has been named Editor-in-Chief (EiC) of ACM Transactions on Parallel Computing (ACM ToPC).

ACM Transactions on Parallel Computing is a forum for novel and innovative work on all aspects of parallel computing, and addresses all classes of parallel-processing platforms, from concurrent and multithreaded to clusters and supercomputers.

CSE Chair David Bader Named Editor-in-Chief of ACM Transactions on Parallel Computing
ACM Transactions on Parallel Computing Names David Bader as Editor-in-Chief

ACM Transactions on Parallel Computing (TOPC) welcomes David Bader as new Editor-in-Chief, for the term November 1, 2018 to October 31, 2021. David is a Professor and Chair in the School of Computational Science and Engineering and College of Computing at Georgia Institute of Technology.

ACM Transactions on Parallel Computing Names David Bader as Editor-in-Chief
David Bader on Real World Challenges for Big Data Analytics

Bader PASC18 interview

In this video from PASC18, David Bader from Georgia Tech summarizes his keynote talk on Big Data Analytics.

“Emerging real-world graph problems include: detecting and preventing disease in human populations; revealing community structure in large social networks; and improving the resilience of the electric power grid. Unlike traditional applications in computational science and engineering, solving these social problems at scale often raises new challenges because of the sparsity and lack of locality in the data, the need for research on scalable algorithms, and development of frameworks for solving these real-world problems on high performance computers, and for improved models that capture the noise and bias inherent in the torrential data streams. In this talk, Bader will discuss the opportunities and challenges in massive data-intensive computing for applications in social sciences, physical sciences, and engineering.”

David Bader on Real World Challenges for Big Data Analytics
Massive-Scale Analytics Applied to Real-World Problems

Bader PASC18 interview

In this keynote video from PASC18, David Bader from Georgia Tech presents: Massive-Scale Analytics Applied to Real-World Problems.

“Emerging real-world graph problems include: detecting and preventing disease in human populations; revealing community structure in large social networks; and improving the resilience of the electric power grid. Unlike traditional applications in computational science and engineering, solving these social problems at scale often raises new challenges because of the sparsity and lack of locality in the data, the need for research on scalable algorithms and development of frameworks for solving these real-world problems on high performance computers, and for improved models that capture the noise and bias inherent in the torrential data streams. In this talk, Bader will discuss the opportunities and challenges in massive data-intensive computing for applications in social sciences, physical sciences, and engineering.”

Massive-Scale Analytics Applied to Real-World Problems
"I Lost 50 Pounds Making One Simple Change"

By Jen Babakhan

More than one-third of Americans are obese: We’re facing a national crisis, and solutions are in short supply. Here’s how one man turned a personal tracker into weight-loss success, one step at a time.

"I Lost 50 Pounds Making One Simple Change"
Bennett University, IEEE hold global meet on machine learning and data science
The conference coincided with the recent MoU signed between Bennett University and Nvidia, making it the first educational institute in the country to get the DGX-1V100 AI supercomputer.

NEW DELHI: Bennett University’s computer science and engineering (CSE) department held its first international conference on machine learning and data science at its Greater Noida campus that saw researchers and academicians deliberating on the new wave of technologies and their impact on the world of big data, machine learning and artificial intelligence (AI).

Bennett University, IEEE hold global meet on machine learning and data science
15th Graph500 List Reveals Top Machines for Running Data Applications

The 15th Graph500 list – which ranks supercomputers based on how quickly they can build knowledge from massive-scale data sets – was released Nov. 15 at Supercomputing 2017 (SC17), with Japan’s K-Computer defending the number-one spot it has held for several years in a row.

15th Graph500 List Reveals Top Machines for Running Data Applications
Georgia Tech Professor Helps Set White House’s HPC Agenda

Georgia Tech Professor David Bader, chair of the School of Computational Science and Engineering (CSE), participated in the National Strategic Computing Initiative (NSCI) Anniversary Workshop in Washington D.C., held July 29. Created in 2015 via an Executive Order by President Barack Obama, the NSCI is responsible for ensuring the United States continues leading in high-performance computing (HPC) in coming decades.

Georgia Tech Professor Helps Set White House’s HPC Agenda
What Exactly Is a 'Flop,' Anyway?

By Michael Byrne

Earlier this week, President Obama signed an executive order creating the National Strategic Computing Initiative, a vast effort at creating supercomputers at exaflop scales. Cool. An exaflop-scale supercomputer is capable of 10^18 floating point operations per second (FLOPS), which is a whole lot of FLOPS.
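
As a back-of-the-envelope illustration of that scale, the short Python snippet below compares an exaflop machine to an assumed laptop sustaining roughly 100 GFLOPS; the laptop figure is a round-number assumption, not a benchmark.

    # Rough sense of scale for an exaflop machine (illustrative numbers only).
    exaflops = 10**18             # floating point operations per second
    laptop_flops = 100 * 10**9    # assume a laptop sustaining ~100 GFLOPS

    print(f"One exaflop is roughly {exaflops // laptop_flops:,} such laptops")
    # -> One exaflop is roughly 10,000,000 such laptops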

What Exactly Is a 'Flop,' Anyway?
IBM, Nvidia rev their HPC engines in next-gen supercomputer push

By Katherine Noyes, Senior U.S. Correspondent, IDG News Service

Hard on the heels of the publication of the latest Top 500 ranking of the world’s fastest supercomputers, IBM and Nvidia on Monday announced they have teamed up to launch two new supercomputer centers of excellence to develop the next generation of contenders.

IBM, Nvidia rev their HPC engines in next-gen supercomputer push
Accenture Awards 11 Research Grants to Leading Universities to Promote Greater R&D Collaboration, Accelerate Pace of Innovation

Accenture (NYSE:ACN) has awarded 11 research grants to top universities around the world to significantly broaden and deepen the relationships between Accenture’s technology research and development (R&D) groups and leading university researchers.

Accenture Awards 11 Research Grants to Leading Universities to Promote Greater R&D Collaboration, Accelerate Pace of Innovation
Google Can Now Describe Your Cat Photos

By Rolfe Winkler

Google’s trained computers recognized that this is a photo of “two pizzas sitting on top of a stove top oven” Google

Google’s computers learned to recognize cats in photos. Now, they’re learning to describe cats playing with a ball of string.

Google Can Now Describe Your Cat Photos
College of Computing Picks Bader to Lead School of CSE

Following a national search for new leadership of its School of Computational Science and Engineering (CSE), Georgia Tech’s College of Computing has selected its own David A. Bader, a renowned leader in high-performance computing, to chair the school.

College of Computing Picks Bader to Lead School of CSE
Opening Up the Accelerator Advantage

By Tiffany Trader

Researchers at Georgia Institute of Technology and University of Southern California will receive nearly $2 million in federal funding for the creation of tools that will help developers exploit hardware accelerators in a cost-effective and power-efficient manner. The purpose of this three-year NSF grant is to bring formerly niche supercomputing capabilities into the hands of a more general audience to help them achieve high-performance for applications that were previously deemed hard to optimize. The project will involve the use of tablets, smart phones and other Internet-era devices, according to David Bader, the lead principal investigator.

Opening Up the Accelerator Advantage
Understanding the Human Condition with Big Data and HPC

In this guest feature from Scientific Computing World, Georgia Institute of Technology’s David A. Bader discusses his upcoming ISC’13 session, Better Understanding Brains, Genomes & Life Using HPC Systems.

Supercomputing at ISC has traditionally focused on problems in areas such as the simulation space for physical phenomena. Manufacturing, weather simulations and molecular dynamics have all been popular topics, but an emerging trend is the examination of how we use high-end computing to solve some of the most important problems that affect the human condition.

Understanding the Human Condition with Big Data and HPC
World's Most Powerful Big Data Machines Charted on Graph 500

By Joab Jackson, U.S. Correspondent, IDG News Service

The Top500 is no longer the only ranking game in town: make way for the Graph 500, which tracks how well supercomputers handle big-data-styled workloads.

World's Most Powerful Big Data Machines Charted on Graph 500
Why Do Super Computers Use Linux?

In our last few posts we discussed the fact that over 90% of supercomputers (94.2% to be precise) employ Linux as their operating system. In this post, a sequel to those posts, we shall attempt to investigate the qualities of Linux that make it suitable, and perhaps the best choice, as an operating system for supercomputers.

Why Do Super Computers Use Linux?
Data Analysts Seek to Make Social Media More Useful

It’s not easy turning the Mayberry Police Department into the team from CSI, or turning an idea for a new type of social network analysis into something like Klout on steroids, but those types of transformations are becoming ever more realistic. The world’s universities and research institutions are hard at work figuring out ways to make the mountains of social data generated every day more useful and, hopefully, make us realize there’s more to social data than just figuring out whose digital voice is the loudest.

Aspiring heirs to the Klout throne, for example, might look to a project called Stinger now under development at Georgia Institute of Technology. Stinger, which stands for Spatio-Temporal Interaction Networks and Graphs Extensible Representation, is a graph-processing engine that project lead David Bader says is bigger, faster, and more flexible than anything currently in use for analyzing social media connections. You provide a shared-memory computing system, and it provides an open-source tool that can help detect relationships between billions of people, places, and things as those relationships change over time—even in real time.

Someone using Facebook (FB) data, for example, might write an algorithm where people or pages would be the vertices and actions (likes, shares, wall posts, etc.) would be the graph’s edges. One relatively easy application, Bader explains, would be to analyze how activity around particular people is increasing, decreasing, or changing, therefore indicating changes in their importance or the growth of new communities.

Writing an algorithm to perform that kind of analysis isn’t really the problem, though—it’s writing one that can scale into the billions of vertices and edges and still perform quickly enough to be useful. An algorithm that generates one false positive in a million isn’t so bad when you’re dealing with tens of thousands of items, Bader says, but it gets to be a big problem when you’re talking about billions of items against which it’s running.

There are dozens of open-source graph databases available, including popular offerings such as Neo4j and InfiniteGraph. But, Bader says, “our lab focuses on algorithms that run fast on massive data sets and that are more accurate than what is traditionally done in social media.” Bader’s team recently presented a paper detailing a social media algorithm running atop Stinger that ran 100 times faster than some previous approaches because the system stores the graph’s previous state and performs only the minimal amount of processing necessary as new edges are inserted. This is in contrast to traditional approaches that reprocess the entire graph every time there’s a change.

That being said, Georgia Tech isn’t alone in analyzing massive amounts of social data with graph databases. Google’s (GOOG) Pregel had already scaled to billions of vertices and edges as of 2009, and Facebook is currently analyzing more than a billion edges using Apache Giraph (an open-source, Hadoop-based Pregel implementation). But those cases—both companies are loaded with smart engineers, data scientists, and powerful infrastructure—just underscore the importance of what researchers like Bader are building and releasing as open source.
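
The snippet below is not Stinger itself, just a toy Python sketch of the incremental idea described above: when a new edge streams in, update only the vertices it touches rather than reprocessing the whole graph; the per-vertex degree count stands in for whatever metric an analyst actually tracks.

    from collections import defaultdict

    degree = defaultdict(int)   # running per-vertex metric (here, simple degree)

    def insert_edge(u, v):
        """Process one streamed edge with O(1) work instead of a full-graph pass."""
        degree[u] += 1
        degree[v] += 1

    for u, v in [("alice", "page1"), ("bob", "page1"), ("alice", "bob")]:
        insert_edge(u, v)

    print(dict(degree))   # {'alice': 2, 'page1': 2, 'bob': 2}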

Data Analysts Seek to Make Social Media More Useful
How DARPA Does Big Data

By Nicole Hemsoth

The world lost one of its most profound science fiction authors in the early eighties, long before the flood of data came down the vast virtual mountain.

It was a sad loss for literature, but it also bore a devastating hole in the hopes of those seeking a modern fortuneteller who could so gracefully grasp the impact of the data-humanity dynamic. Philip K. Dick foresaw a massively connected society—and all of the challenges, beauties, frights and potential for invasion (or safety, depending on your outlook).

How DARPA Does Big Data
University of Maryland Distinguished Alumni Award

In 2011, The University of Maryland’s Department of Electrical and Computer Engineering established the Distinguished Alumni Award to recognize alumni who have made significant and meritorious contributions to their fields. Alumni are nominated by their advising professors or the department chair, and the Department Council then approves their selection. In early May, the faculty and staff gather to honor the recipients at a luncheon.

University of Maryland Distinguished Alumni Award
AAAS Members Elected as Fellows

In November 2011, the AAAS Council elected 539 members as Fellows of AAAS. These individuals will be recognized for their contributions to science and technology at the Fellows Forum to be held on 18 February 2012 during the AAAS Annual Meeting in Vancouver, British Columbia. The new Fellows will receive a certificate and a blue and gold rosette as a symbol of their distinguished accomplishments.

AAAS Members Elected as Fellows
Pentagon to Develop Computer System to Prevent Another Wikileaks

The Pentagon is currently working to prevent another WikiLeaks situation within the department. Wired.com recently reported that a group of military-funded scientists are developing a sophisticated computer system that can scan and interpret every keystroke, log-in and uploaded file over the Pentagon’s networks.

Pentagon to Develop Computer System to Prevent Another Wikileaks
Big Brothers, PRODIGAL Sons, and Cybersecurity

By Julian Sanchez

I wrote on Monday that a cybersecurity bill overwhelmingly approved by the House Permanent Select Committee on Intelligence risks creating a significantly broader loophole in federal electronic surveillance law than its boosters expect or intend. Creating both legal leeway and a trusted environment for limited information sharing about cybersecurity threats—such as the identifying signatures of malware or automated attack patterns—is a good idea. Yet the wording of the proposed statute permits broad collection and disclosure of any information that would be relevant to protecting against “cyber threats,” broadly defined. For now, that mostly means monitoring the behavior of software; in the near future, it could as easily mean monitoring the behavior of people.

Big Brothers, PRODIGAL Sons, and Cybersecurity
Sifting through petabytes: PRODIGAL monitoring for lone wolf insider threats

By Darlene Storm

Homeland Security Director Janet Napolitano said the “risk of ‘lone wolf’ attackers, with no ties to known extremist networks or grand conspiracies, is on the rise as the global terrorist threat has shifted,” reported CBSNews. An alleged example of such a lone wolf terror suspect is U.S. citizen Jose Pimentel, who learned “bomb-making on the Internet and considered changing his name to Osama out of loyalty to Osama bin Laden.” He was arrested on charges of “plotting to blow up post offices and police cars and to kill U.S. troops.” But the CSMonitor reported the FBI decided Pimentel was not a credible threat. It’s unlikely Pimentel will be able to claim “entrapment” since he “left muddy footprints on the Internet” which proves “his intent was to cause harm.” The grand jury decision against Pimentel was delayed until January, as others described “the Idiot Jihadist Next Door” as just another “homegrown U.S. terrorist wannabe.”

Sifting through petabytes: PRODIGAL monitoring for lone wolf insider threats
Interview: DARPA's ADAMS Project Taps Big Data to Find the Breaking Bad

By Rich Brueckner, insideHPC

In this video, Professor David Bader from Georgia Tech discusses his participation in the DARPA ADAMS project. The Anomaly Detection at Multiple Scales (ADAMS) program uses Big Data Analytics to look for cleared personnel that might be on the verge of “Breaking Bad” and becoming internal security threats.

Interview: DARPA's ADAMS Project Taps Big Data to Find the Breaking Bad
CODONiS Teaming with ISB to Meet Big Challenge: Managing Terabytes of Personal Biomedical Data

Seattle-based CODONiS, a provider of advanced computing platforms for life sciences and healthcare, has teamed up with scientists from the world-renowned Institute for Systems Biology, a nonprofit research organization in Seattle, to advance biomedical computing for future personalized healthcare. The results from this ground-breaking collaboration will be discussed at the “Personalized Healthcare Challenges for High Performance Computing” panel discussion being held at the SC11 Conference in Seattle on November 15, 2011.

CODONiS Teaming with ISB to Meet Big Challenge: Managing Terabytes of Personal Biomedical Data
IEEE Computer Society Golden Core Award

In 2010, David A. Bader received the IEEE Computer Society Golden Core Award. A plaque is awarded for long-standing member or staff service to the society. This program was initiated in 1996 with a charter membership of 450. Each year the Awards Committee will select additional recipients from a continuing pool of qualified candidates and permanently include their names in the Golden Core Member master list.

IEEE Computer Society Golden Core Award
DARPA Sets Ubiquitous HPC Program in Motion

By Michael Feldman

The US Defense Advanced Research Projects Agency (DARPA) has selected four “performers” to develop prototype systems for its Ubiquitous High Performance Computing (UHPC) program. According to a press release issued on August 6, the organizations include Intel, NVIDIA, MIT, and Sandia National Laboratory. Georgia Tech was also tapped to head up an evaluation team for the systems under development. The first UHPC prototype systems are slated to be completed in 2018.

DARPA Sets Ubiquitous HPC Program in Motion
Executive Guide to SC08: Tuesday

By Michael Feldman

Tuesday marks the first full day of the conference technical program. This year’s conference keynote will be given by Michael Dell, chairman and CEO of Dell, Inc. Dell’s selection reflects both the changing face of the industry, and the conference’s location – Dell is headquartered about 20 miles north of Austin in Round Rock, Texas.

Executive Guide to SC08: Tuesday
NPR interviews David Bader on the chip in the PlayStation 3

NPR radio interview on Atlanta’s 90.1 WABE station

In this radio interview, Atlanta’s NPR station 90.1 WABE’s John Lemley interviews David Bader on the IBM Cell microchip in the Sony PlayStation 3 that could save lives.

NPR interviews David Bader on the chip in the PlayStation 3
Featured Keynote Speaker: David Bader at Third Annual High Performance Computing Day at Lehigh University

Third Annual High Performance Computing Day at Lehigh

http://www.lehigh.edu/computing/hpc/hpcday.html

Friday, April 4, 2008
Featured Keynote Speaker

David Bader ‘90, ‘91G
Petascale Phylogenetic Reconstruction of Evolutionary Histories
http://www.lehigh.edu/computing/hpc/hpcday/2008/hpckeynote.html
Executive Director of High Performance Computing
College of Computing, Georgia Institute of Technology

Featured Keynote Speaker: David Bader at Third Annual High Performance Computing Day at Lehigh University
Georgia Tech 'CellBuzz' Cluster in Production Use

Georgia Tech is one of the first universities to deploy the IBM BladeCenter QS20 Server for production use, through Sony-Toshiba-IBM (STI) Center of Competence for the Cell Broadband Engine (http://sti.cc.gatech.edu/) in the College of Computing at Georgia Tech. The QS20 uses the same ground-breaking Cell/B.E. processor appearing in products such as Sony Computer Entertainment’s PlayStation3 computer entertainment system, and Toshiba’s Cell Reference Set, a development tool for Cell/B.E. applications.

Georgia Tech 'CellBuzz' Cluster in Production Use
Georgia Tech to Host Cell BE Workshop

Georgia Tech will be hosting a two-day workshop on software and applications for the Cell Broadband Engine, to be held on Monday, June 18 and Tuesday, June 19, at the Klaus Advanced Computing Building (http://www.cc.gatech.edu/) at Georgia Institute of Technology, in Atlanta, GA, United States. The workshop is sponsored by Georgia Tech and the Sony, Toshiba, IBM (STI) Center of Competence for the Cell BE.

Georgia Tech to Host Cell BE Workshop
Bader Receives 2006 IBM Faculty Award

Congratulations to Associate Professor David Bader who recently received a 2006 IBM Faculty Award in recognition of his outstanding achievement and importance to industry. The highly competitive award, valued at $40,000, was given to Bader for making fundamental contributions to the design and optimization of parallel scientific libraries for multicore processors, such as the IBM Cell. As an international leader in innovation for the most advanced computing systems, IBM recognizes the strength of collaborative research with the College of Computing at Georgia Tech’s Computational Science and Engineering (CSE) division.

Bader Receives 2006 IBM Faculty Award
Alumnus Bader Joins DSPlogic Advisory Board
David Bader

Ph.D. alumnus David Bader ‘96, Associate Professor of Computational Science and Engineering at Georgia Tech, has joined the Technical Advisory Board of DSPlogic, a provider of FPGA-based, reconfigurable computing and signal processing products and services. Bader has been a pioneer in the field of high performance computing for problems in bioinformatics and computational genomics, and has co-authored over 75 articles in peer-reviewed journals and conferences. His main areas of research are in parallel algorithms, combinatorial optimization, and computational biology and genomics.

Alumnus Bader Joins DSPlogic Advisory Board
Obituary: Morris Bader

Morris Bader, 72, of Bethlehem, died peacefully at home on Thursday, April 21, 2005.

Born: In New York, he was a son of the late Louis and Esther Saltzman Bader.

Personal: He and his wife, the former Karen Roberts, were married for 45 years. He was a graduate of Stuyvesant High School in New York City. He was a 1953 graduate of the City University of New York, formerly City College of New York and earned his Ph.D. in physical chemistry at Indiana University, Bloomington, Ind. He taught at New York University, Marietta College in Marietta, Ohio, and Moravian College. He was an emeritus professor of chemistry at Moravian College. He taught chemistry and computer science from 1962 until his retirement in 1995. He also taught physical chemistry, developed the initial computer science program, conceived and funded the SOAR program for funding student and faculty summer research, and collaborated and developed a plant growth hormone. He was a scientific glassblower, making much of his own equipment. He developed scientific programs and published five computer manuals and software which sold worldwide, the profits of which were donated to assist faculty research travel to conferences. He developed the course, “Chemistry for the Non-Science Major” and his paper “A Systematic Approach to Standard Addition Methods in Instrumental Analysis” is highly cited and used widely in practice. Morris holds two patents, one for a bicycle gearing system and one for a quartz infrared cell; both manufactured. He has published numerous articles in numerical scientific computation for chemical analysis, solution of hard differential equations, and improved accuracy and error analysis in numerical computing. His chemistry publications include various chemical experiments for use by educators, and guideline and error estimates for the neglect of buoyancy in laboratory weighings. He has published in the Journal of Chemical Education and in American Laboratory of which he was a contributing editor.

Memberships: He was a championship chess player and the Moravian College Chess Club Advisor. He supported the Moravian College Foreign Film Festival. His many volunteering activities included teaching swimming to toddlers at the 3rd Street Alliance, Easton, Rodale Theater, State Theatre, Easton. He was a 21-year Musikfest volunteer, Lehigh Valley Hospital-Muhlenberg Hospital. He was financial advisor to the Friendship Circle of the J.C.C., and was assistant Scoutmaster of Troops 304 and 346 in the Minsi Trails Council, Boy Scouts of America. Morris was a member of Congregation Beth Avraham, formerly Agudath Achim of Bethlehem. He was a member of the board and then president for 20 years. He read Torah and led services.

Obituary: Morris Bader
IEEE Distributed Systems Online Names First Editorial Board

IEEE Distributed Systems Online, the IEEE’s first online-only publication has named the charter members of its editorial board. Editor-in-Chief Jean Bacon culled the 17 board members from leading professionals within academia and the distributed systems industry worldwide.

IEEE Distributed Systems Online Names First Editorial Board
UNM Computing Faculty Collaborating with IBM to Design Next-Gen Supercomputer

UNM Mirage

UNM Computing faculty David A. Bader, Patrick Bridges, Arthur B. Maccabe and Bernard Moret, are collaborating on IBM’s Productive, Easy-to-use, Reliable, Computing Systems (PERCS) project, a new initiative to design a supercomputer several orders of magnitude faster than today’s high-end systems.

UNM Computing Faculty Collaborating with IBM to Design Next-Gen Supercomputer
UNM Professor Bader selected as speaker for national program

UNM Mirage

David A. Bader, assistant professor in the Electrical and Computer Engineering Department, has been selected as an Institute of Electrical and Electronics Engineers (IEEE) Computer Society Distinguished Speaker. Bader is named to a group of about three dozen speakers from throughout the country and will serve a three-year term.

UNM Professor Bader selected as speaker for national program
UNM Engineering Professors receive $1.1 million in NSF Grants

UNM Mirage

University of New Mexico School of Engineering Professors Bernard Moret, Computer Science, and David Bader, Electrical and Computer Engineering, have received more than $1.1 million in grants this fall from the National Science Foundation (NSF) to pursue research in reconstructing evolutionary trees (known as “phylogenies”).

UNM Engineering Professors receive $1.1 million in NSF Grants
UNM Engineering Professors receive NSF CAREER Awards

UNM Mirage

University of New Mexico School of Engineering professors David A. Bader and Hy D. Tran recently received the National Science Foundation Faculty Early Career Development (CAREER) Awards.

UNM Engineering Professors receive NSF CAREER Awards
GRAPPA Runs In A Record Time

Using the largest open-production Linux supercluster in the world, LosLobos, researchers at The University of New Mexico’s Albuquerque High Performance Computing Center (http://www.ahpcc.unm.edu) have achieved a nearly one-million-fold speedup in solving the computationally-hard phylogeny reconstruction problem for the family of twelve Bluebell species (scientific name: Campanulaceae) from the flowers’ chloroplast gene order data. (The problem size includes a thirteenth plant, Tobacco, used as a distantly-related outgroup). Phylogenies derived from gene order data may prove crucial in answering some fundamental open questions in biomolecular evolution. Yet very few techniques are available for such phylogenetic reconstructions.

GRAPPA Runs In A Record Time
Access ParaScope from Concurrency’s home page

IEEE Concurrency’s home page (http://computer.org/concurrency/) now includes a link to ParaScope, a comprehensive listing of parallel computing sites on the Internet. The list is maintained by David A. Bader, assistant professor in the University of New Mexico’s Department of Electrical and Computer Engineering. You can also go directly to the links at http://computer.org/parascope/#parallel.

Access ParaScope from Concurrency’s home page
First Place, CNIU20 Microcomputer Contest

Colonial Intermediate Unit 20 (CNIU20) Microcomputer Contest
1st Place 4/18/86
David Bader

First Place, CNIU20 Microcomputer Contest
Eagle Scout Court of Honor

Eagle Scout Court of Honor