Supercomputing Our Way To A Clean Energy Future

Editor’s Note: EarthTechling is proud to repost this article courtesy of the U.S. Department of Energy. Author credit goes to Michael Hess.

These days supercomputing isn’t just for niche applications like unlocking the secrets of dark matter, finding the Higgs boson, and helping us understand nuclear weapons without explosive testing. With recent strides in technology and a number of high-profile success stories, advanced computing is catching the attention of major companies looking to lower their research and development costs while producing more efficient, more powerful energy technology.

Recently at the Workshop on the Grand Challenges of Advanced Computing for Energy Innovation near Washington, DC, computing specialists from the private sector, the National Labs, and academia met to share best practices, discuss trends, and chart the future of computing in energy technology.

Image: computer model of a truck, via LLNL Livermore Valley Open Campus

Computer-aided design (CAD) software took engineers from the drawing board to the keyboard decades ago, but the bulk of variable testing still takes place with physical prototypes wired to sensors that generate large volumes of data for analysis. What if engineers could instead build a virtual prototype and test it, system-wide, under every conceivable condition? With help from the National Labs, energy technology companies are doing just that, and recent collaborative projects and programs have benefited both the labs and the companies.
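The virtual-prototype idea can be illustrated with a short parameter sweep: evaluate every candidate design under a range of operating conditions in software, rather than instrumenting one physical prototype at a time. The standard aerodynamic drag formula is real, but the values, design ranges, and function names below are hypothetical stand-ins for illustration only, not drawn from any lab or company code.

```python
import itertools

def drag_force(speed_mps, frontal_area_m2, drag_coeff, air_density=1.225):
    """Toy aerodynamic drag model: F = 0.5 * rho * Cd * A * v^2 (newtons)."""
    return 0.5 * air_density * drag_coeff * frontal_area_m2 * speed_mps ** 2

# Sweep a grid of candidate designs across several operating speeds
# (all numbers are illustrative, not measured data).
speeds = [20, 25, 30]            # m/s, operating conditions to test under
areas = [8.0, 9.0, 10.0]         # m^2, candidate frontal areas
coeffs = [0.55, 0.60, 0.65]      # candidate drag coefficients

# Pick the (area, Cd) pair with the lowest total drag over the sweep.
best = min(
    itertools.product(areas, coeffs),
    key=lambda design: sum(drag_force(v, *design) for v in speeds),
)
print(best)  # (8.0, 0.55): drag grows with both area and Cd, so the smallest pair wins
```

A real simulation campaign replaces the toy formula with a full computational fluid dynamics solve per design point, which is why supercomputer-scale resources matter: the sweep itself is trivially parallel, but each evaluation is expensive.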

At the workshop, the truck manufacturer Navistar reported significant advances in improving airflow around its vehicles, which increases fuel efficiency and durability. Instead of relying on expensive wind tunnel testing, Navistar used modeling and simulation software from Lawrence Livermore National Laboratory (LLNL) to make these improvements at a fraction of traditional research costs.

Other major companies like Westinghouse, IBM, and General Electric reported similar positive results from their partnerships with the National Labs. Everyone benefits when the private sector can easily leverage advanced computing resources and talent to build the next generation of nuclear reactors, computer software, diesel trucks, and turbines that will help unlock the promise of a clean energy future.

For researchers at the labs, such partnerships are about applying their scientific discoveries, advancing the state of the art in their fields, and solving real-world problems with far-reaching impact on competitiveness, energy technology, and the economy. Partnerships can also give scientists the opportunity to test and validate the models and algorithms they’ve developed.

The Consortium for Advanced Simulation of Light Water Reactors (CASL) is a paragon of public-private partnership in high-performance modeling and simulation. Powered by the sixth-fastest supercomputer in the world, CASL gives researchers an unprecedented virtual look at how fluids behave inside a nuclear reactor core. Instead of building a machine and then testing it under real-world conditions, researchers can feed variables into the virtual machine to see what happens. This research is informing the design of the next generation of safe and secure nuclear reactors.

The findings from the Grand Challenges workshop will be published in a report in November that will describe the current state of industrial high-performance computing applications, viable next steps for select industries, and best practices for overcoming obstacles of integration into current R&D processes.

The mission of the Energy Department is to ensure America’s security and prosperity by addressing its energy, environmental, and nuclear challenges through transformative science and technology solutions.