Noctua 2 - Technical description

The overall setup of the Noctua 2 supercomputer, with insights into the operation of the cluster from a hardware, software, and facility perspective, is described in the Noctua 2 paper.

System: Atos BullSequana XH2000
Number of Processor Cores: 143,872
Number of GPUs: 136
Total Main Memory: 347.5 TiB
Floating-Point Performance:
CPU: 5.4 PFLOPS DP peak (4.19 PFlop/s Linpack)
GPU: 2.49 PFLOPS DP Tensor Core peak (ca. 1.7 PFlop/s Linpack)
(see the back-of-the-envelope check below)
Cabinets: 12 racks with direct liquid cooling,
7 racks with air cooling, four of them with active backdoor cooling
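
As a back-of-the-envelope check of the peak figures (estimates derived from the component specifications below, not official numbers):

    • GPU: 128 A100s in the GPU partition x 19.5 TFLOP/s FP64 Tensor Core peak = 2.496 PFLOP/s, matching the quoted 2.49 PFLOPS.

    • CPU: assuming 16 DP FLOPs per cycle (two 256-bit FMA pipes per core), a 2.45 GHz core peaks at 39.2 GFLOP/s; 143,872 cores x 39.2 GFLOP/s ≈ 5.6 PFLOP/s. The quoted 5.4 PFLOPS is somewhat lower, since the 7713- and 7742-based nodes clock below 2.45 GHz.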
Compute Nodes

990 nodes, each with

    • 2x AMD Milan 7763, 2.45 GHz, up to 3.5 GHz

       • 2x 64 cores

    • 256 GiB main memory
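
With 2x 64 cores per node, the 256 GiB of main memory works out to 2 GiB per core.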

Compute Nodes (Large Memory)

66 nodes, each with

    • 2x AMD Milan 7763, 2.45 GHz, up to 3.5 GHz

       • 2x 64 cores

    • 1024 GiB main memory

Compute Nodes (Huge Memory)

5 nodes, each with

    • 2x AMD Milan 7713, 2.0 GHz, up to 3.675 GHz

       • 2x 64 cores

    • 2 TiB main memory

    • 34 TiB SSD-based storage

       • 12x 3.2 TB NVMe SSDs, ~70 GB/s
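
The capacity follows from the drive count: 12 x 3.2 TB = 38.4 TB, i.e. roughly 34.9 TiB after converting decimal terabytes to binary tebibytes, consistent with the 34 TiB quoted above.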

GPU Nodes

32 nodes, each with

    • 2x AMD Milan 7763, 2.45 GHz, up to 3.5 GHz

       • 2x 64 cores

    • 512 GiB main memory

    • 4x NVIDIA A100 with NVLink and 40 GB HBM2
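
Within a GPU node, the four A100s are coupled via NVLink, so each GPU can read and write the memory of its peers directly. The following sketch (illustrative, not from the source; plain C against the CUDA runtime API, compiled with nvcc or linked against libcudart) reports which device pairs support such peer access:

    #include <stdio.h>
    #include <cuda_runtime.h>

    int main(void) {
        int n = 0;
        if (cudaGetDeviceCount(&n) != cudaSuccess || n == 0) {
            fprintf(stderr, "no CUDA devices visible\n");
            return 1;
        }
        printf("%d GPUs visible\n", n); /* expect 4 on a Noctua 2 GPU node */

        /* For every ordered pair (i, j), ask the runtime whether device i
           can map device j's memory for direct access (over NVLink here). */
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                if (i == j)
                    continue;
                int ok = 0;
                cudaDeviceCanAccessPeer(&ok, i, j);
                printf("GPU %d -> GPU %d: peer access %s\n",
                       i, j, ok ? "supported" : "not supported");
            }
        }
        return 0;
    }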

NVIDIA DGX-A100

1 node, with

    • 2x AMD EPYC Rome 7742, 2.25 GHz, up to 3.4 GHz

       • 2x 64 cores

    • 1024 GiB main memory

    • 8x NVIDIA A100 with NVLink and 40 GB HBM2

FPGA Nodes

36 FPGA nodes, each with

    • 2x AMD Milan 7713, 2.0 GHz, up to 3.675 GHz

        • 2x 64 cores

    • 512 GiB main memory

80 FPGA cards in total

    • 48x Xilinx Alveo U280 FPGAs with 8 GiB HBM2 and 32 GiB DDR memory

    • 32x Intel Stratix 10 GX 2800 FPGAs with 32 GiB DDR memory (Bittware 520N cards)

HACC Nodes

3 HACC (Heterogeneous Accelerated Compute Cluster) nodes, each with

    • 2x Xilinx Alveo U55C
    • 2x Xilinx VCK5000
    • 4x AMD Instinct MI210

Communication Network CPUs

Mellanox InfiniBand HDR (100/200 Gb/s), 2:1 blocking factor

Communication Network FPGAs

Application-specific interconnect

    • 4x40 Gbps links per FPGA

    • Connected via CALIENT S320 Optical Circuit Switch (OCS)

    • Configurable point-to-point connections to any other FPGA

    • Accessible as serial channels from the OpenCL board support package (BSP)
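
Each FPGA thus has 4 x 40 Gb/s = 160 Gb/s of aggregate link bandwidth. To illustrate how such links are used (a hypothetical sketch, not code from the source): with the Intel FPGA SDK for OpenCL, external serial links are exposed to kernels as I/O channels. The channel IDs below ("kernel_input_ch0", "kernel_output_ch0") are placeholders; the real names are defined by the BSP.

    // OpenCL C kernel sketch for the Stratix 10 cards; channel IDs are placeholders.
    #pragma OPENCL EXTENSION cl_intel_channels : enable

    channel float link_in  __attribute__((io("kernel_input_ch0")));
    channel float link_out __attribute__((io("kernel_output_ch0")));

    __kernel void forward(unsigned n) {
        for (unsigned i = 0; i < n; i++) {
            float v = read_channel_intel(link_in);   // word arriving from a remote FPGA
            write_channel_intel(link_out, v);        // word forwarded to a remote FPGA
        }
    }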

Storage System

DDN Exascaler 7990X with NVMe accelerator
Lustre file system with 6 PB capacity

In the Top500 list of June 2022, Noctua 2 is ranked #121.
In the Green500 list of June 2022, Noctua 2 is ranked #82.