Sandia RAPTOR turbulent combustion code selected for next-gen Summit supercomputer readiness project
May 28, 2015
RAPTOR, a turbulent combustion code developed by Sandia National Laboratories mechanical engineer Dr. Joseph Oefelein, was selected as one of 13 partnership projects for the Center for Accelerated Application Readiness (CAAR). CAAR, a US Department of Energy program based at the Oak Ridge Leadership Computing Facility, focuses on optimizing application codes for the next generation of supercomputers.
Developed at Sandia’s Combustion Research Facility, RAPTOR is a general solver optimized for Large Eddy Simulation (LES, a mathematical technique for modeling turbulence) and is targeted at transportation power and propulsion systems. Optimizing RAPTOR for Summit’s hybrid architecture will enable a new generation of high-fidelity simulations run at operating conditions and geometries identical to those of real engines. Simulations at that fidelity will allow direct comparisons with companion experiments, providing insight into transient combustion processes such as thermal stratification, heat transfer, and turbulent mixing.
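As general background (standard LES theory, not detail from the article): LES resolves the large, energy-carrying turbulent eddies directly and models only the smallest scales. Formally, each flow variable is convolved with a spatial filter G, and filtering the momentum equations produces an unclosed subgrid-scale stress that must be modeled:

```latex
\bar{\phi}(\mathbf{x}) = \int G(\mathbf{x}-\mathbf{x}')\,\phi(\mathbf{x}')\,\mathrm{d}\mathbf{x}'
\qquad
\tau_{ij} = \overline{u_i u_j} - \bar{u}_i\,\bar{u}_j
```

Here \(\bar{\phi}\) is the filtered (resolved) field and \(\tau_{ij}\) is the subgrid-scale stress tensor; choosing a model for \(\tau_{ij}\) is the central closure problem in LES.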
Argonne supercomputer helped Rice/Minnesota team identify materials to improve fuel production
April 29, 2015
Scientists at Rice University and the University of Minnesota recently identified, through a large-scale, multi-step computational screening process, promising zeolite structures for two fuel applications: purification of ethanol from fermentation broths and the hydroisomerization of alkanes with 18–30 carbon atoms encountered in petroleum refining. (Earlier post.)
To date, more than 200 types of zeolites have been synthesized, and more than 330,000 potential zeolite structures have been predicted by computer simulation. With such a large pool of candidate materials, identifying the optimal zeolite for a particular job with traditional laboratory methods would be a time- and labor-intensive process that could take decades. The researchers instead ran their large-scale, multi-step computational screening process on Mira, the Argonne Leadership Computing Facility’s (ALCF) 10-petaflops IBM Blue Gene/Q supercomputer.
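The general shape of such a multi-step screen is a funnel: cheap criteria prune the candidate pool before expensive simulations rank the survivors. A minimal sketch of that pattern, with entirely hypothetical structure names, properties, and cutoffs (not the Rice/Minnesota workflow or data):

```python
# Illustrative multi-step screening funnel. Each stage scores the surviving
# candidates and keeps only the top fraction; later stages stand in for
# progressively more expensive calculations.

def screen(candidates, stages):
    """candidates: dict name -> property dict; stages: list of (score_fn, keep_fraction)."""
    survivors = list(candidates)
    for score_fn, keep in stages:
        ranked = sorted(survivors, key=lambda n: score_fn(candidates[n]), reverse=True)
        survivors = ranked[: max(1, int(len(ranked) * keep))]
    return survivors

# Toy data: pore diameter (Å) and a mock adsorption score per structure.
zeolites = {
    "A": {"pore": 5.5, "ads": 0.90},
    "B": {"pore": 7.4, "ads": 0.70},
    "C": {"pore": 3.0, "ads": 0.95},
    "D": {"pore": 6.1, "ads": 0.80},
}

best = screen(zeolites, [
    (lambda p: p["pore"], 0.5),  # cheap stage: keep the widest-pore half
    (lambda p: p["ads"], 0.5),   # costly stage: keep the strongest adsorbers
])
print(best)  # → ['D']
```

The funnel ordering is the key design choice: running the cheap geometric filter first means the expensive scoring only ever touches a small fraction of the 330,000-plus candidates.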
DOE investing $200M in next-gen supercomputer for Argonne; on the road to exascale computing
April 10, 2015
Under the joint Collaboration of Oak Ridge, Argonne, and Lawrence Livermore (CORAL) initiative, the US Department of Energy (DOE) will invest $200 million to deliver a next-generation supercomputer—Aurora—to the Argonne Leadership Computing Facility (ALCF). When commissioned in 2018, this supercomputer will be open to all scientific users.
The new system, Aurora, will use Intel’s HPC (high-performance computing) scalable system framework to deliver a peak performance of 180 PetaFLOP/s. Aurora, in effect a “pre-exascale” system, will be delivered in 2018. Argonne and Intel will also provide an interim system, the 8.5 PetaFLOP/s Theta, to be delivered in 2016, which will help Argonne Leadership Computing Facility (ALCF) users transition their applications to the new technology; Theta will require only 1.7 MW of power.
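From the two Theta figures quoted above (8.5 PetaFLOP/s peak at 1.7 MW), the implied peak power efficiency works out to 5 GFLOP/s per watt. A quick arithmetic check (peak-performance figures, not sustained throughput):

```python
# Theta's implied peak power efficiency from the article's figures.
peak_flops = 8.5e15   # 8.5 PetaFLOP/s peak performance
power_watts = 1.7e6   # 1.7 MW power draw

gflops_per_watt = peak_flops / power_watts / 1e9
print(gflops_per_watt)  # → 5.0
```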