Argonne National Laboratory (ANL), a US Department of Energy research center, has completed the installation of the Aurora supercomputer, which is poised to be one of the most powerful supercomputers in the world. Built on the HPE Cray EX platform with Intel Xeon CPU Max Series processors and Intel Data Center GPU Max Series accelerators, the system is designed to deliver more than two exaflops of peak performance, equivalent to over two quintillion (two billion billion) operations per second.
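For a sense of scale, the headline figure is just a unit conversion: one exaflop corresponds to 10^18 floating-point operations per second. The minimal Python sketch below spells out the arithmetic, using only the two-exaflop figure quoted above:

```python
# Convert Aurora's quoted peak performance into raw operations per second.
EXAFLOP = 10**18          # one exaflop = 10^18 floating-point operations per second
peak_exaflops = 2         # headline figure quoted for Aurora

ops_per_second = peak_exaflops * EXAFLOP
print(f"{ops_per_second:,} operations per second")  # 2,000,000,000,000,000,000 (two quintillion)
```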
The Aurora supercomputer will be used to tackle complex scientific and engineering problems in physics, biology, chemistry, medicine, energy, and the environment, and it will also support research in artificial intelligence and machine learning. The US scientific community and international partners will have access to the system.
Installation of the Aurora supercomputer began in 2018 and was completed by the end of 2022. The ANL team is now testing and debugging the system and preparing applications for full-scale production runs. Aurora is expected to be fully operational in 2023.
The Aurora supercomputer comprises 10,624 blade servers housed in 166 racks; each blade contains two Intel Xeon Max processors with 40 cores each and six Intel Data Center GPU Max Series graphics processors. The racks are arranged in eight rows and occupy an area equivalent to two professional basketball courts in the Argonne Leadership Computing Facility (ALCF) data center. Each blade delivers 130 teraflops of computing power. In total, the Aurora system contains 21,248 Intel Xeon CPU Max (Sapphire Rapids) processors and 63,744 Intel Data Center GPU Max (Ponte Vecchio) GPUs.
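The component totals follow directly from the per-blade configuration. A quick consistency check in Python, using only the blade count and per-blade CPU/GPU figures given above, is sketched here:

```python
# Re-derive Aurora's component totals from the per-blade figures quoted above.
blades = 10_624
cpus_per_blade = 2    # Intel Xeon CPU Max (Sapphire Rapids) processors per blade
gpus_per_blade = 6    # Intel Data Center GPU Max (Ponte Vecchio) GPUs per blade

total_cpus = blades * cpus_per_blade   # 21,248 processors
total_gpus = blades * gpus_per_blade   # 63,744 GPUs

print(f"CPUs: {total_cpus:,}, GPUs: {total_gpus:,}")
```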
With approximately 10 petabytes of RAM and 230 petabytes of storage, the Aurora nodes are interconnected by the Cray Slingshot fiber-optic network, which provides 200 Gbit/s of bandwidth. The supercomputer's power consumption is estimated at 60 megawatts.
ANL's announcement of the completed installation is available on its website, and a YouTube video provides additional technical details.