
Research & Ideas


Interdisciplinary team develops ‘exascale’ computing for spaceflight

Stanford engineers are part of a nationwide effort to build a next-generation hardware and software system to provide the ultimate assurance of safety when space rockets blast off.

Saturn V Rocket

What will happen at the critical instant a rocket engine ignites? Stanford engineers will help answer that question. | Image courtesy of Javier Urzay

Putting humans on another celestial body is among the most complex and noble pursuits society has ever undertaken, but it doesn’t come quickly or cheaply.

Imagine, then, the precise moment that all that effort and expense — everything — is riding on the success of a single split-second laser blast that starts the rocket engine. “It’s a decisive moment for the entire journey. If the laser fails, the entire mission, including the lives of the astronauts, will be lost,” says project director Gianluca Iaccarino, a professor of mechanical engineering and director of the Institute for Computational and Mathematical Engineering at Stanford. “We have to be as close to 100 percent certain of success as we can possibly be.”

So now, in a 5-year, $16.5 million effort funded by the U.S. Department of Energy through its National Nuclear Security Administration, researchers at Stanford Engineering, Purdue University and the University of Colorado at Boulder are seeking to deploy the most powerful computing capabilities in the world to study and predict the crucial outcome of that decisive moment.

In particular, to ensure that the laser never fails, the collaborators will develop sophisticated mathematical models and software running on so-called “exascale” computers, which are capable of a quintillion (a number equal to 1 followed by 18 zeros) calculations per second.

The project continues an uninterrupted investment that the DOE has made at Stanford over the last 20 years to develop multidisciplinary computational science and to foster collaborations between engineers and computer scientists, combining innovations in algorithms, physics simulations and programming models.

A new approach

As simple as it sounds, ignition technology is incredibly complex and involves expertise in many physical phenomena, including fluid dynamics, laser-induced energy deposition, thermodynamics, turbulence, mixing and combustion. Add in the complexities introduced by operating in the vacuum of space, and it amounts to a monumental challenge for engineers, says Javier Urzay, the associate director of the project and senior research engineer at the Stanford Center for Turbulence Research.

When the Apollo missions traveled to the moon, they carried extra propellant used to lift the lunar lander off the moon’s silvery surface and return safely home. But the fuels for those missions were highly volatile and toxic; merely having them on board the spacecraft was a risk. Scientists have long sought a new approach. Recently, the spotlight has turned to liquid oxygen and methane (natural gas), which are nontoxic but less reactive, and therefore require a new approach to ignition. In this context, the laser acts like the spark plug in an internal combustion engine. “It lights the fire,” says Iaccarino. But if it fails, adds Urzay, “the spacecraft is stranded in space without any way to propel itself home.”

So far, the new ignition technology has not been tested in the blast furnace of a real-world rocket engine — there hasn’t been a crewed mission to another celestial body since the 1970s. So the engineers will need to rely on computers to simulate the technology. But with the stakes high and the need for surety paramount, even today’s fastest computers won’t do. The team therefore plans to rewrite its models in a new programming language known as Regent, which works in combination with a parallel runtime system called Legion — both developed at Stanford — to get more seamless performance from next-generation supercomputers. Legion was recently recognized as an R&D 100 award winner in an international competition to identify the world’s top new technologies.

The team plans to use Frontier, Aurora and El Capitan, the Department of Energy’s new exascale supercomputers now in production. Those systems can handle both traditional high-performance computing and artificial intelligence computations, giving researchers an unprecedented set of tools to address increasingly complex scientific problems.

Total collaboration

Iaccarino notes that the team of researchers assembled across the three institutions is as interdisciplinary as any he has ever worked with. The predictive modeling work will be done at Stanford. The team at Purdue will conduct small-scale, real-world tests of the ignition technology. The team in Boulder will measure the predictions against the tests.

Highlighting the extraordinary collaboration, Iaccarino says he dubbed the project INtegrated SImulations using Exascale Multiphysics Ensembles (INSIEME) and points out that insieme means “together” in his native Italian. “You have to learn as you go and do it all together. It is a total collaboration,” he says. “And I think that’s really what’s exciting about the project.”