If We Build it, They Will Come?
YOU MIGHT SAY, to use an anachronistic analogy, that the National Science Foundation put the cart before the horse as it readied itself to enter the world of “petaflop” computing.

Poised to build the first civilian computer capable of such mind-boggling processing power at the National Center for Supercomputing Applications in Urbana-Champaign, Illinois, the NSF wondered if, in fact, there would be anyone out there ready to take advantage of a machine capable of churning through one million billion (10¹⁵) operations per second: the petaflop.

So out went a call to the scientific community saying, in essence, “Get your applications ready” for such a supercomputing beast.

The call was answered by, among others, Joachim “Jimmy” Raeder and colleagues in the Space Science Center. The group’s proposal garnered a $1.5 million, four-year contract from the NSF to ready their Open Geospace General Circulation Model (OpenGGCM for short).

[Photo: Jimmy Raeder. Credit: Kristi Donahue, UNH-EOS]
OpenGGCM simulates nothing less than the interaction between the solar wind and the magnetosphere, the region of space dominated by Earth’s magnetic field, a dance that, among other things, creates the aurora and space weather. So complex is this “magnetohydrodynamic” simulation that it is one of the so-called “grand challenges” of modern-day computational science (climate modeling being another) and, thus, is an application yearning for the power of the petaflop.

As luck would have it, Sony’s PlayStation3 game console is built around the same kind of superchip (the Cell Broadband Engine, co-developed by Sony, IBM, and Toshiba) that the NSF will likely use in its petaflop machine. And so Raeder and his group have purchased 40 Sony PlayStation3 consoles and bundled them together into a supercomputing cluster to gain but a fraction of the prized petaflop.

“You need a lot of computing power to do games realistically, to run the graphics,” Raeder says of the PS3 superchip. He adds, “So we’re not the only ones taking this approach; there are lots of projects in which people have used PlayStations to do scientific calculations.”

[Photo: the PS3 cluster]
Raeder points out that the cluster of PlayStations won’t even come close to petaflop power, which requires tens of thousands of superchips working in unison. “But it does provide us with a platform on which we can test our approaches to make the code very fast,” he says. (The Roadrunner Project, a high-performance computer being built at the Los Alamos National Laboratory, uses 13,000 Cell Broadband Engines.)

The superchip that drives the PlayStation3 runs some 50 times faster than a typical processor. Looked at another way, while a “normal” chip, which is about the size of your thumbnail, contains one or two central processing units or “brains,” the PS3 chip has seven to nine brains per thumbnail and can perform upwards of 10¹¹ operations per second. (Even so, 40 of them together deliver only a few trillion operations per second, a small fraction of one percent of a petaflop.)

However, straight out of the box, it’s not plug-and-play for Raeder’s simulationists; there are several major hurdles to overcome before any science can be done.

First, the PS3 must be “tweaked” to run an open-source Linux operating system; otherwise, scientists will only be able to use the console to play Grand Theft Auto IV, Star Wars: The Force Unleashed, Hail to the Chimp, and other such mind-expanding activities.

The second and more challenging hurdle concerns the scientific program itself, in this case the OpenGGCM, which must be rewritten to run on the highly specialized superchip. This reprogramming fell to Kai Germaschewski, who spent the better part of two months completing the complex task. (SSC’s Germaschewski and Doug Larson, Daniel Bergeron, and Andrew Foulks of the Computer Science Department are all part of the PS3 project.)
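To give a flavor of what such a port involves, here is a hypothetical plain-C sketch (not Germaschewski’s actual code, and not IBM’s Cell SDK). The Cell pairs one general-purpose control core, the PPE, with specialized worker cores, the SPEs, and each SPE can only compute on data explicitly copied into its small 256-kilobyte local store. The sketch mimics that pattern with ordinary POSIX threads, using memcpy into a small scratch buffer where real Cell code would issue DMA transfers:

    /* Hypothetical sketch of the Cell offload pattern: a control thread
     * stands in for the PPE, worker threads for the SPEs, and explicit
     * copies into a small scratch buffer mimic DMA into local store. */
    #include <pthread.h>
    #include <stdio.h>
    #include <string.h>

    #define N_WORKERS 6          /* Linux on the PS3 exposes six SPEs */
    #define CHUNK     4096       /* floats that fit easily in a local store */
    #define N_TOTAL   (N_WORKERS * CHUNK)

    static float grid[N_TOTAL];  /* big array living in main memory */

    struct job { int first; int count; };

    static void *worker(void *arg)
    {
        struct job *j = arg;
        float local[CHUNK];      /* stand-in for the SPE local store */

        /* "DMA in": copy our chunk from main memory to local scratch */
        memcpy(local, &grid[j->first], j->count * sizeof(float));

        for (int i = 0; i < j->count; i++)
            local[i] = local[i] * 0.5f + 1.0f;   /* toy compute kernel */

        /* "DMA out": copy the results back */
        memcpy(&grid[j->first], local, j->count * sizeof(float));
        return NULL;
    }

    int main(void)
    {
        pthread_t tid[N_WORKERS];
        struct job jobs[N_WORKERS];

        for (int w = 0; w < N_WORKERS; w++) {
            jobs[w].first = w * CHUNK;
            jobs[w].count = CHUNK;
            pthread_create(&tid[w], NULL, worker, &jobs[w]);
        }
        for (int w = 0; w < N_WORKERS; w++)
            pthread_join(tid[w], NULL);

        printf("processed %d cells on %d workers\n", N_TOTAL, N_WORKERS);
        return 0;
    }

Compile with cc -pthread; on the real chip, the worker loops would be built as separate SPE programs and loaded through IBM’s SPE runtime rather than run as threads.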

Lastly, while the PS3 chip has speed to spare, it has little memory to speak of, which means only small chunks of a monster simulation like OpenGGCM can be run at any one time. But string a bunch of PlayStations together (cluster computing) and the problem is solved, as the sketch below illustrates.
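Stringing machines together typically works by classic domain decomposition: each console owns one slab of the simulation grid and swaps boundary cells with its neighbors every time step, so a problem too large for any single machine’s memory fits across the cluster. Here is a minimal, hedged MPI sketch of that pattern (a toy diffusion update stands in for the real MHD solver; none of the names below come from OpenGGCM):

    /* Minimal sketch of cluster computing: each node owns a slab of the
     * grid and exchanges one-cell "halo" boundaries with neighbors. */
    #include <mpi.h>
    #include <stdio.h>
    #include <string.h>

    #define LOCAL_N 1024         /* grid cells owned by this node */

    int main(int argc, char **argv)
    {
        int rank, size;
        double field[LOCAL_N + 2];   /* +2 ghost cells for neighbor data */

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        memset(field, 0, sizeof field);
        field[LOCAL_N / 2] = (double)rank;   /* toy initial condition */

        int left  = (rank == 0)        ? MPI_PROC_NULL : rank - 1;
        int right = (rank == size - 1) ? MPI_PROC_NULL : rank + 1;

        for (int step = 0; step < 100; step++) {
            /* swap boundary cells (the nodes "talking to each other") */
            MPI_Sendrecv(&field[1], 1, MPI_DOUBLE, left, 0,
                         &field[LOCAL_N + 1], 1, MPI_DOUBLE, right, 0,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            MPI_Sendrecv(&field[LOCAL_N], 1, MPI_DOUBLE, right, 1,
                         &field[0], 1, MPI_DOUBLE, left, 1,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);

            /* simple diffusion update standing in for the MHD solver */
            double next[LOCAL_N + 2];
            for (int i = 1; i <= LOCAL_N; i++)
                next[i] = 0.5 * field[i]
                        + 0.25 * (field[i - 1] + field[i + 1]);
            memcpy(&field[1], &next[1], LOCAL_N * sizeof(double));
        }

        if (rank == 0)
            printf("ran %d-node halo exchange\n", size);
        MPI_Finalize();
        return 0;
    }

Built with mpicc and launched with, say, mpirun -np 40, the same program would spread the grid across 40 nodes.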

In this case, 40 PS3s talking to each other will churn through an OpenGGCM simulation and do so, relatively speaking, on the cheap: roughly $16,000 worth of hardware. “If our calculations are right, we will be able to run our program on 40 of these at $400 apiece, plus a few bucks for a rack, network switch, and cables,” Raeder says. In contrast, the group’s simulations currently run on a supercomputing cluster (named “Zaphod” in honor of a character from the science fiction classic The Hitchhiker’s Guide to the Galaxy) that is an 8,000-pound, $750,000 collection of 320 processors.

The thrust of the simulation work Raeder’s group does is connected with a NASA mission known as THEMIS, for Time History of Events and Macroscale Interactions during Substorms. The two-year, five-probe mission aims to resolve one of the oldest mysteries in space physics: what physical process in near-Earth space triggers the violent, colorful auroral eruptions that occur during substorms in Earth’s magnetosphere.

Says Raeder, who is a co-investigator on the mission, “Our role is to help people understand the data they’re getting from the spacecraft with our simulations. The magnetosphere is huge, so even with five probes it’s kind of like having five thermometers in the U.S. and trying to figure out what the weather is going to be. Running these simulations helps us to better understand what’s going on in the real world.”

by David Sims, Science Writer, Institute for the Study of Earth, Oceans, and Space. Published in the Spring 2008 issue of EOS Spheres.