Trends

$600M Cray supercomputer will tower above the rest — to build better nukes

Cray has been commissioned by Lawrence Livermore National Laboratory to create a supercomputer head and shoulders above all the rest, with the contract valued at some $600 million. Disappointingly, El Capitan, as the system will be called, will be more or less solely dedicated to redesigning our nuclear armament.

El Capitan will be the third “exascale” computer being built by Cray for the U.S. government, the other two being Aurora for Argonne National Lab and Frontier for Oak Ridge. These computers are built on a whole new architecture called Shasta, in which Cray intends to combine the speed and scale of high performance computing with the easy administration of cloud-based enterprise tools.
Due for delivery in 2022, El Capitan will operate on the order of 1.5 exaflops. Flops, or floating point operations per second, are the measure of calculation most often used to track supercomputer performance, and the prefix exa- denotes a quintillion — so that's 1.5 quintillion operations every second.
Right now the top dog is already at Oak Ridge: an IBM-built system called Summit. At roughly 150 petaflops, it has about a tenth of the power El Capitan is slated to deliver. Of course, the former is operational and the latter is still theoretical right now, but you get the idea.
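For a rough sense of that gap, here's a back-of-the-envelope comparison of the two figures (a minimal Python sketch using the approximate numbers cited above, not official benchmark results):

```python
# Rough scale comparison: Summit (~150 petaflops) vs. El Capitan's target (~1.5 exaflops).
# Figures are the approximate ones quoted in the article, not measured benchmarks.

PETA = 10**15  # one quadrillion
EXA = 10**18   # one quintillion

summit_flops = 150 * PETA      # ~1.5e17 floating point operations per second
el_capitan_flops = 1.5 * EXA   # ~1.5e18 floating point operations per second

print(f"El Capitan / Summit ~ {el_capitan_flops / summit_flops:.0f}x")  # ~10x
```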
One wonders exactly what all this computing power is needed for. There are in fact countless domains of science that could be advanced by access to a system like El Capitan: atmospheric and geological processes, for instance, could be simulated in 3D at a larger scale and higher fidelity than ever before.

So it was a bit disheartening to learn that El Capitan will, once fully operational, be dedicated almost solely to classified nuclear weaponry design.
To be clear, that doesn’t just mean bigger and more lethal bombs. The contract is being carried out with the collaboration of the National Nuclear Security Administration, which of course oversees the nuclear stockpile alongside the Department of Energy and military. It’s a big operation, as you might expect.
We have an aging nuclear weapons stockpile that was essentially designed and engineered over a period of decades ending in the ’90s. We may not need to build new ones, but we do actually have to keep our old ones in good shape, not just in case of war but to prevent them failing in their advancing age and decrepitude.

[Image: The components of Cray's Shasta systems.]

“We like to say that while the stockpile was designed in two dimensions, it’s actually aging in three,” said LLNL director Bill Goldstein in a teleconference call on Monday. “We’re currently redesigning both warhead and delivery system. This is the first time we’ve done this for about 30 years now. This requires us to be able to simulate the interaction between the physics of the nuclear system and the engineering features of the delivery system. These are real engineering interactions and are truly 3D. This is an example of a new requirement that we have to meet, a new problem that we have to solve, and that we simply can’t rely on two-dimensional simulations to get at. And El Capitan is being delivered just in time to address this problem.”
Although in response to my question Goldstein declined to provide a concrete example of a 3D versus 2D research question or result, citing the classified nature of the work, it’s clear that his remarks are meant to be taken both literally and figuratively. The depth, so to speak, of factors affecting a nuclear weapons system may be said to have been much flatter in the ’90s, when we lacked the computing resources to do the complex physics simulations that might inform their design. So both conceptually and spatially the design process has expanded.
That said, let’s be clear: “warhead and delivery systems” means nukes, and that is what this $600 million supercomputer will be dedicated to.
There’s a silver lining there: Before being air-gapped and entering into its classified operations, El Capitan will have a “shakeout period” during which others will have access to it. So while for most of its life it will be hard at work on weapons systems, during its childhood it will be able to experience a wider breadth of scientific problems.
The exact length of that period, and who will have access, are still to be determined (this is still three years out), but it’s not an afterthought to quiet jealous researchers. The team needs to get used to the tools and work with Cray to refine the system before it moves on to the top secret stuff. And opening it up to a variety of research problems and methods is a great way to do that, while also providing a public good.
Yet Goldstein referred to the 3D simulations of nuclear weapons physics as the “killer app” of the new computer system. Perhaps not the phrase I would have chosen. But it’s hard to deny the importance of making sure the nuclear stockpile is functional and not leaking or falling apart — I just wish the most powerful computer ever planned had a bit more noble of a purpose.
Source: TechCrunch