
exaCB

exaCB is a set of tools and configurations that enables Continuous Benchmarking (CB) on supercomputers. For a given target application, it provides a common entry point to monitor application performance and to collect data in a uniform manner.


Description

exaCB is a framework for continuous benchmarking on HPC systems, motivated by the fact that modern platforms are heterogeneous and the software stacks are complex and fast-moving. A small set of classic benchmarks no longer represents the range of real applications, and it often misses the effects of site-specific builds, dependency choices, and configuration details. At the same time, maintaining a large and diverse benchmark set as a mostly manual, center-owned activity does not scale, especially if it is disconnected from upstream development. exaCB therefore treats benchmarking as part of normal software engineering practice: automated, repeatable, and integrated into everyday workflows via CI/CD.

In practice, we have applied exaCB in JUREAP, the early-access program preparing applications for the JUPITER exascale system, where it was used to execute benchmarks for a collection of 70+ applications across diverse domains under an evolving system environment. Runs were orchestrated automatically through shared CI/CD components, and results were collected in a uniform, machine-readable format, enabling cross-application analyses with minimal additional effort. In parallel, we are integrating the JUPITER Benchmark Suite, which consists of sixteen application benchmarks and seven synthetic benchmarks with extensive JUBE workflows and reference results, into exaCB. The goal is to make procurement-grade benchmarks continuously reproducible in the same automated workflows.

Operationally, exaCB organizes benchmarks in regular Git repositories and runs them through CI/CD pipelines, so benchmark execution and evaluation become routine rather than occasional events. The framework combines command-line tools with reusable CI/CD building blocks, currently implemented for GitLab CI/CD, and integrates established benchmarking harnesses such as JUBE and ReFrame. Benchmark runs produce structured, machine-readable results that are post-processed into reports which can be compared across systems and over time, supporting regression detection and longer-term trend analysis. A key principle is incremental adoption: benchmarks can start at a minimal runnability level and then mature toward richer measurements and stronger reproducibility as they evolve. exaCB also supports framework-level workload augmentation, so common experiments such as non-invasive instrumentation via injected wrappers for energy measurements can be applied consistently across many benchmarks through shared platform configuration.
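As an illustration of the kind of post-processing such structured results allow, the following Python sketch compares one run's metrics against a stored reference and flags regressions. The JSON layout (a "metrics" mapping per run), the file names, and the 10% tolerance are assumptions made for this example, not the actual exaCB result schema.

"""Minimal regression-check sketch over machine-readable benchmark results.

The result layout (one JSON file per run with a "metrics" mapping) and the
10% tolerance are illustrative assumptions, not the exaCB schema.
"""
import json
from pathlib import Path


def load_metrics(path: Path) -> dict[str, float]:
    """Read one run's metrics, e.g. {"runtime_s": 123.4, "energy_j": 5.6e6}."""
    with path.open() as f:
        return json.load(f)["metrics"]


def check_regressions(current: dict[str, float],
                      reference: dict[str, float],
                      tolerance: float = 0.10) -> list[str]:
    """Flag metrics that are worse than the reference by more than `tolerance`.

    Lower is assumed to be better for every metric (runtime, energy).
    """
    findings = []
    for name, ref_value in reference.items():
        cur_value = current.get(name)
        if cur_value is None:
            findings.append(f"{name}: missing in current run")
        elif cur_value > ref_value * (1.0 + tolerance):
            findings.append(
                f"{name}: {cur_value:.3g} vs reference {ref_value:.3g} "
                f"(+{(cur_value / ref_value - 1.0) * 100:.1f}%)"
            )
    return findings


if __name__ == "__main__":
    current = load_metrics(Path("results/current.json"))
    reference = load_metrics(Path("results/reference.json"))
    problems = check_regressions(current, reference)
    if problems:
        print("Possible regressions:")
        print("\n".join(f"  - {p}" for p in problems))
        raise SystemExit(1)  # non-zero exit makes the CI job fail
    print("No regressions beyond tolerance.")

A check like this can run as the final step of a CI/CD benchmark job, so a performance regression fails the pipeline in the same way a broken test would.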

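The idea behind wrapper injection can likewise be sketched in a few lines: a shared platform configuration contributes a wrapper command that is prepended to the launch line, so the benchmark itself stays untouched. The configuration keys, the launcher line, and the energy-wrap tool below are hypothetical; this is a minimal illustration of the mechanism, not the exaCB implementation.

"""Sketch of framework-level workload augmentation via an injected wrapper.

The platform-configuration format and the wrapper command are illustrative
assumptions; they show the idea of prepending a measurement tool to the
launch line without modifying the benchmark itself.
"""
import shlex
import subprocess

# Assumed platform configuration: a wrapper that is prepended to every
# benchmark launch, here a hypothetical energy-measurement command.
PLATFORM_CONFIG = {
    "launcher": "srun --ntasks=4",
    "wrapper": "energy-wrap --output energy.json",  # hypothetical tool
}


def build_command(app: str, args: list[str], config: dict[str, str]) -> list[str]:
    """Compose launcher + optional wrapper + application into one command line."""
    parts = shlex.split(config["launcher"])
    if config.get("wrapper"):
        parts += shlex.split(config["wrapper"])
    return parts + [app, *args]


if __name__ == "__main__":
    cmd = build_command("./my_benchmark", ["--steps", "100"], PLATFORM_CONFIG)
    print("Launching:", " ".join(shlex.quote(c) for c in cmd))
    subprocess.run(cmd, check=True)

Because the wrapper lives in the shared platform configuration rather than in each benchmark, the same augmentation can be switched on for many benchmarks at once.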

Participating organisations

Forschungszentrum Jülich

Contributors

Jayesh Badwaik
Andreas Herten
Mathis Bode

Helmholtz Program-oriented Funding IV