US weather agency to boost supercomputers to 2.5 petaflops each

The National Oceanic and Atmospheric Administration (NOAA) plans to boost the capacity of its two supercomputers roughly tenfold by October 2015, the agency said Monday. With the upgrade, the agency hopes to deliver more accurate and timely weather forecasts.

The supercomputer upgrade comes courtesy of a $44.5 million contract with IBM, which is subcontracting with Seattle-based supercomputer maker Cray Inc. to improve the systems. Of that $44.5 million, NOAA said that $25 million “was provided through the Disaster Relief Appropriations Act of 2013 related to the consequences of Hurricane Sandy.”

The National Weather Service (part of NOAA) will reap the benefits this month when the two supercomputers double their combined capacity from 0.776 petaflops to 1.552 petaflops in the first step of the overhaul. With the added power, the National Weather Service will be able to run an upgraded version of its Global Forecast System with higher resolution and longer-range forecasts.

Global Forecast System

When the upgrade is finished, each supercomputer should have a capacity of 2.5 petaflops, for a combined capacity of 5 petaflops.

While that’s a sizable jump in capacity, the world’s fastest supercomputer, China’s Tianhe-2, delivers a peak performance of about 55 petaflops.
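To keep the numbers straight, here is a minimal back-of-the-envelope sketch in Python using only the petaflop figures quoted above; the variable names are ours, and the Tianhe-2 comparison is illustrative only.

```python
# Back-of-the-envelope check of the capacity figures in this story.
# All values are in petaflops, as quoted above.

current_total = 0.776          # combined capacity of NOAA's two machines today
phase_one_total = 1.552        # combined capacity after the first upgrade step
final_per_machine = 2.5        # target capacity of each supercomputer
final_total = 2 * final_per_machine

tianhe_2_peak = 55.0           # approximate peak performance of China's Tianhe-2

print(f"Phase one factor: {phase_one_total / current_total:.1f}x")          # 2.0x
print(f"Final combined capacity: {final_total:.1f} petaflops")              # 5.0
print(f"Tianhe-2 peak vs. NOAA total: {tianhe_2_peak / final_total:.0f}x")  # 11x
```

Run as written, the sketch shows the first step doubling today’s combined capacity, a final total of 5 petaflops, and a Tianhe-2 peak still roughly 11 times larger.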

In November, IBM announced that it would build two new supercomputers based on its OpenPower technology for the U.S. Department of Energy. Those systems should be operational by 2017 and are expected to deliver more than 100 peak petaflops.

Researchers hope deep learning algorithms can run on FPGAs and supercomputers

The NSF has funded projects that will investigate how deep learning algorithms run on FPGAs and across systems connected by high-performance RDMA interconnects. Another project, led by Andrew Ng and two supercomputing experts, aims to put the models on supercomputers and give them a Python interface.

Why Amazon thinks big data was made for the cloud

According to Amazon Web Services Chief Data Scientist Matt Wood, big data and cloud computing are nearly a match made in heaven. Limitless, on-demand and inexpensive resources open up new worlds of possibility, and a central platform makes it easy for communities to share huge datasets.

How federal money will spur a new breed of big data

By pumping hundreds of millions of dollars into big data research and development, the Obama administration thinks it can push research well beyond today’s state of the art and into entirely new areas. It’s a noble goal, but also a necessary one.