
Most powerful supercomputer dedicated to geosciences is now live

Tibi Puiu
October 17, 2012 @ 12:32 pm

Some of the Yellowstone supercomputer’s racks. A mosaic of the Yellowstone National Park was put in place as a tribute. (c) CARLYE CALVIN / NCAR

While climate change may be a subject of intense debate, with equally enthusiastic supporters on both sides of the fence, one thing no one, whatever their side, should argue against is allocating resources for its study. Just recently, one of the most powerful tools for studying the planet’s climate in great detail was powered up: the “Yellowstone” 1.5-petaflops supercomputer, which has already earned a place among the top 20 supercomputers in the world.

The system went live at the NCAR-Wyoming Supercomputing Center in Cheyenne, Wyoming, where it was met with enthusiasm by the meteorologists and geoscientists stationed there, and by the rest of the world for that matter. Yellowstone promises to aid scientists in running complex climate models, allowing them to study anything from hurricanes and tornadoes to geomagnetic storms, tsunamis, and wildfires, as well as to locate resources such as oil miles beneath the Earth’s surface.

People “want to know what [climate change] is going to do to precipitation in Spain or in Kansas,” said Rich Loft, the director of technology development at the center.

The supercomputer can perform computations at 1.5 petaflops, which translates to a staggering 1,500 teraflops, or 1.5 quadrillion calculations per second. To get an idea of both the upgrade Yellowstone offers and the pace of technological advancement in the past few years, consider that NCAR’s previous supercomputer, Bluefire, commissioned in 2008, peaked at 76 teraflops, yet was still one of the most powerful supercomputers of its day.
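For the curious, the speedup implied by those figures is easy to check with a back-of-the-envelope sketch (the peak figures are the ones quoted above; actual sustained performance on real workloads would of course be lower):

```python
# Peak performance figures quoted above, in floating-point operations per second.
YELLOWSTONE_FLOPS = 1.5e15   # 1.5 petaflops = 1,500 teraflops
BLUEFIRE_FLOPS = 76e12       # Bluefire's 76 teraflops (commissioned 2008)

# 1.5 petaflops expressed in teraflops.
teraflops = YELLOWSTONE_FLOPS / 1e12

# Raw peak speedup over Bluefire: roughly a factor of 20.
speedup = YELLOWSTONE_FLOPS / BLUEFIRE_FLOPS

print(f"{teraflops:.0f} teraflops, ~{speedup:.0f}x Bluefire's peak")
```

Running this prints "1500 teraflops, ~20x Bluefire's peak": even comparing peak numbers alone, four years of hardware progress bought NCAR nearly a twenty-fold jump.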

The $70 million data center comprises 100 racks with 72,288 compute cores built from Intel Sandy Bridge processors, a massive 144.6-terabyte storage farm, and a system for visualizing all of its data.

A powerful tool for predicting our planet’s climate

All these numbers might not mean much on their own, but put Yellowstone’s tasks into context and they become impressive. For instance, a short-term weather forecast that would typically take Bluefire a few hours to complete can be rendered by Yellowstone in mere minutes. But it’s not speed where Yellowstone shines most; it’s the complexity of the tasks it can undertake. Scientists typically build climate models of a region by dividing it into a grid of cells roughly 100 km wide, yet Yellowstone can refine that resolution to as fine as 10 km. The finer grid allows for a far more detailed and accurate assessment, much closer to how the climate actually behaves.
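To see why that jump in resolution demands so much computing power, consider a quick sketch (the region size here is hypothetical; the 100 km and 10 km cell widths are the ones quoted above):

```python
# Hypothetical square region, 1,000 km on a side, tiled by a model grid.
REGION_KM = 1000

def grid_cells(cell_km: int) -> int:
    """Number of cells needed to cover the region at a given cell width."""
    per_side = REGION_KM // cell_km
    return per_side * per_side

coarse = grid_cells(100)   # 10 x 10   = 100 cells at 100 km resolution
fine = grid_cells(10)      # 100 x 100 = 10,000 cells at 10 km resolution

# Refining from 100 km to 10 km multiplies the cell count by 100, and the
# model must also take shorter time steps, so the real cost grows even faster.
print(f"{coarse} -> {fine} cells ({fine // coarse}x more)")
```

A hundred times as many grid cells, each updated more often, is exactly the kind of workload that a twenty-fold faster machine makes newly practical.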

“Scientists will be able to simulate these small but dangerous systems in remarkable detail, zooming in on the movement of winds, raindrops, and other features at different points and times within an individual storm. By learning more about the structure and evolution of severe weather, researchers will be able to help forecasters deliver more accurate and specific predictions, such as which locations within a county are most likely to experience a tornado within the next hour,” according to an NCAR statement.

Eleven research projects have already been planned to make use of Yellowstone “to try to do some breakthrough science straight away and try to shake the machine,” according to NCAR officials.

 
