                              SPEChpc(TM) 2021 Tiny Result
              Supermicro SuperServer SYS-F511E2-RT (Intel Xeon Gold 5433N)

hpc2021 License: 6569                                        Test date: Apr-2023
Test sponsor: Supermicro                         Hardware availability: Jan-2023
Tested by: Supermicro                            Software availability: Jan-2023

                 Base    Base   Thrds   Base      Base     Peak    Peak   Thrds   Peak      Peak
Benchmarks       Model   Ranks  pr Rnk  Run Time  Ratio    Model   Ranks  pr Rnk  Run Time  Ratio
--------------   ------  -----  ------  --------  -----    ------  -----  ------  --------  -----
505.lbm_t        MPI       160       1       214   10.5 S
505.lbm_t        MPI       160       1       210   10.7 S
505.lbm_t        MPI       160       1       211   10.6 *
513.soma_t       MPI       160       1       343   10.8 *
513.soma_t       MPI       160       1       344   10.8 S
513.soma_t       MPI       160       1       343   10.8 S
518.tealeaf_t    MPI       160       1       251   6.57 *
518.tealeaf_t    MPI       160       1       250   6.61 S
518.tealeaf_t    MPI       160       1       252   6.56 S
519.clvleaf_t    MPI       160       1       211   7.83 S
519.clvleaf_t    MPI       160       1       208   7.93 *
519.clvleaf_t    MPI       160       1       207   7.95 S
521.miniswp_t    MPI       160       1       389   4.11 S
521.miniswp_t    MPI       160       1       386   4.14 S
521.miniswp_t    MPI       160       1       388   4.13 *
528.pot3d_t      MPI       160       1       362   5.87 *
528.pot3d_t      MPI       160       1       361   5.89 S
528.pot3d_t      MPI       160       1       363   5.86 S
532.sph_exa_t    MPI       160       1       182   10.7 S
532.sph_exa_t    MPI       160       1       183   10.7 S
532.sph_exa_t    MPI       160       1       183   10.7 *
534.hpgmgfv_t    MPI       160       1       232   5.07 S
534.hpgmgfv_t    MPI       160       1       218   5.39 *
534.hpgmgfv_t    MPI       160       1       215   5.47 S
535.weather_t    MPI       160       1       185   17.5 S
535.weather_t    MPI       160       1       185   17.4 *
535.weather_t    MPI       160       1       185   17.4 S
============================================================================================================
505.lbm_t        MPI       160       1       211   10.6 *
513.soma_t       MPI       160       1       343   10.8 *
518.tealeaf_t    MPI       160       1       251   6.57 *
519.clvleaf_t    MPI       160       1       208   7.93 *
521.miniswp_t    MPI       160       1       388   4.13 *
528.pot3d_t      MPI       160       1       362   5.87 *
532.sph_exa_t    MPI       160       1       183   10.7 *
534.hpgmgfv_t    MPI       160       1       218   5.39 *
535.weather_t    MPI       160       1       185   17.4 *

  SPEChpc 2021_tny_base       8.07
  SPEChpc 2021_tny_peak    Not Run


BENCHMARK DETAILS
-----------------
Type of System: Homogenous
Compute Nodes Used: 4
Total Chips: 4
Total Cores: 80
Total Threads: 160
Total Memory: 2 TB
Compiler: C/C++/Fortran: Version 2023.0.0 of Intel oneAPI Compiler
MPI Library: Version 2021.8.0 Build 20221129
Other MPI Info: None
Other Software: None
Base Parallel Model: MPI
Base Ranks Run: 160
Base Threads Run: 1
Peak Parallel Models: Not Run


Node Description: SuperServer SYS-F511E2-RT
===========================================

HARDWARE
--------
Number of nodes: 4
Uses of the node: compute
Vendor: Supermicro
Model: SuperServer SYS-F511E2-RT
CPU Name: Intel Xeon Gold 5433N
CPU(s) orderable: 1 chip
Chips enabled: 1
Cores enabled: 20
Cores per chip: 20
Threads per core: 2
CPU Characteristics: Intel Turbo Boost Technology up to 4.1 GHz
CPU MHz: 1900
Primary Cache: 32 KB I + 48 KB D on chip per core
Secondary Cache: 2 MB I+D on chip per core
L3 Cache: 37.5 MB I+D on chip per chip
Other Cache: None
Memory: 512 GB (8 x 64 GB 2Rx4 PC5-4800B-R)
Disk Subsystem: 1 x 480 GB Micron M.2 NVMe SSD
Other Hardware: None
Accel Count: 0
Accel Model: None
Accel Vendor: None
Accel Type: None
Accel Connection: None
Accel ECC enabled: None
Accel Description: None
Adapter: Supermicro AOC-ATG-i2TM
Number of Adapters: 1
Slot Type: Advanced I/O Module (AIOM) Form Factor
Data Rate: 10 Gb/s
Ports Used: 1
Interconnect Type: AOC-ATG-i2TM

SOFTWARE
--------
Adapter: Supermicro AOC-ATG-i2TM
Adapter Driver: None
Adapter Firmware: None
Operating System: SUSE Linux Enterprise Server 15 SP4
                  Kernel 5.14.21-150400.22-default
Local File System: xfs
Shared File System: None
System State: Multi-user, run level 3
Other Software: None


Interconnect Description: Supermicro AOC-ATG-i2TM
=================================================

HARDWARE
--------
Vendor: None
Model: Supermicro AOC-ATG-i2TM
Switch Model: None
Number of Switches: 1
Number of Ports: 0
Data Rate: 10 Gb/s
Firmware: None
Topology: None
Primary Use: MPI Traffic, NFS Access

SOFTWARE
--------


Submit Notes
------------
The config file option 'submit' was used.
    mpiexec.hydra -bootstrap ssh -hostfile $[top]/hostfile \
        -genv OMP_NUM_THREADS $threads -np $ranks -ppn $ppn $command

General Notes
-------------
MPI startup command:
  mpirun command (mpiexec.hydra) was used to start MPI jobs.

Compiler Version Notes
----------------------
==============================================================================
CXXC 532.sph_exa_t(base)
------------------------------------------------------------------------------
Intel(R) oneAPI DPC++/C++ Compiler 2023.0.0 (2023.0.0.20221201)
Target: x86_64-unknown-linux-gnu
Thread model: posix
InstalledDir: /opt/intel/oneapi/compiler/2023.0.0/linux/bin-llvm
Configuration file: /opt/intel/oneapi/compiler/2023.0.0/linux/bin-llvm/../bin/icpx.cfg
------------------------------------------------------------------------------

==============================================================================
CC 505.lbm_t(base) 513.soma_t(base) 518.tealeaf_t(base) 521.miniswp_t(base)
   534.hpgmgfv_t(base)
------------------------------------------------------------------------------
Intel(R) oneAPI DPC++/C++ Compiler 2023.0.0 (2023.0.0.20221201)
Target: x86_64-unknown-linux-gnu
Thread model: posix
InstalledDir: /opt/intel/oneapi/compiler/2023.0.0/linux/bin-llvm
Configuration file: /opt/intel/oneapi/compiler/2023.0.0/linux/bin-llvm/../bin/icx.cfg
------------------------------------------------------------------------------

==============================================================================
FC 519.clvleaf_t(base) 535.weather_t(base)
------------------------------------------------------------------------------
ifx (IFORT) 2023.0.0 20221201
Copyright (C) 1985-2022 Intel Corporation. All rights reserved.
------------------------------------------------------------------------------

==============================================================================
FC 528.pot3d_t(base)
------------------------------------------------------------------------------
ifx: command line warning #10157: ignoring option '-W'; argument is of wrong type
ifx (IFORT) 2023.0.0 20221201
Copyright (C) 1985-2022 Intel Corporation. All rights reserved.
------------------------------------------------------------------------------

Base Compiler Invocation
------------------------
C benchmarks:
  mpiicc -cc=icx

C++ benchmarks:
  mpiicpc -cxx=icpx

Fortran benchmarks:
  mpiifort -fc=ifx

Base Portability Flags
----------------------
505.lbm_t: -lstdc++
513.soma_t: -lstdc++ -DSPEC_NO_VAR_ARRAY_REDUCE
518.tealeaf_t: -lstdc++
519.clvleaf_t: -lstdc++
521.miniswp_t: -lstdc++
528.pot3d_t: -DSPEC_NO_REORDER -lstdc++
532.sph_exa_t: -lstdc++
534.hpgmgfv_t: -lstdc++
535.weather_t: -lstdc++

Base Optimization Flags
-----------------------
C benchmarks:
  -Ofast -ipo -xCORE-AVX512 -mprefer-vector-width=512 -ansi-alias

C++ benchmarks:
  -Ofast -ipo -xCORE-AVX512 -mprefer-vector-width=512 -ansi-alias

Fortran benchmarks:
519.clvleaf_t:
  -Ofast -ipo -xCORE-AVX512 -mprefer-vector-width=512 -nostandard-realloc-lhs
  -align array64byte
528.pot3d_t:
  -Ofast -ipo -xCORE-AVX512 -mprefer-vector-width=512 -nostandard-realloc-lhs
  -align array64byte -heap-arrays 32768
535.weather_t:
  Same as 519.clvleaf_t

Base Other Flags
----------------
C benchmarks (except as noted below):
  -Ispecmpitime
521.miniswp_t:
  -Ispecmpitime/
534.hpgmgfv_t:
  -Ispecmpitime

C++ benchmarks:
  -Ispecmpitime

Fortran benchmarks:
519.clvleaf_t:
  -Ispecmpitime
528.pot3d_t:
  -Wno-incompatible-function-pointer-types
535.weather_t:
  No flags used

The flags file that was used to format this result can be browsed at
http://www.spec.org/hpc2021/flags/Intel_compiler_flags.2023-06-05.html

You can also download the XML flags source by saving the following link:
http://www.spec.org/hpc2021/flags/Intel_compiler_flags.2023-06-05.xml

SPEChpc is a trademark of the Standard Performance Evaluation Corporation.
All other brand and product names appearing in this result are trademarks or
registered trademarks of their respective holders.

-----------------------------------------------------------------------------------------------------------------------
For questions about this result, please contact the tester.
For other inquiries, please contact info@spec.org.
Copyright 2021-2023 Standard Performance Evaluation Corporation
Tested with SPEChpc2021 v1.1.7 on 2023-04-26 16:25:33-0400.
Report generated on 2023-06-05 11:41:45 by hpc2021 ASCII formatter v1.0.3.
Originally published on 2023-06-05.
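
Appendix (editor's sketch, not part of the original disclosure): the 'submit'
command in the Submit Notes is filled in by the hpc2021 harness through shell
variables. A minimal sketch of how it would expand for the base run reported
here, assuming 160 ranks spread evenly over the 4 compute nodes (40 ranks per
node, 1 thread per rank); the install path and benchmark binary name are
hypothetical placeholders:

```shell
# Values inferred from this report; paths/binary names are placeholders.
top=/path/to/hpc2021        # hypothetical install tree ($[top] in the config)
threads=1                   # Base Threads Run
ranks=160                   # Base Ranks Run
ppn=$((ranks / 4))          # 4 compute nodes -> 40 ranks per node
command=./lbm_exe           # placeholder for the benchmark binary

# Print (rather than execute) the launch line the harness would issue:
echo mpiexec.hydra -bootstrap ssh -hostfile "$top/hostfile" \
     -genv OMP_NUM_THREADS "$threads" -np "$ranks" -ppn "$ppn" "$command"
```

The printed line matches the submit template above with each variable
substituted; in an actual run the harness executes it instead of echoing.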