# AgentPerfBench
LLM inference benchmark: 3,197 main sweep rows and 37 per-layer kernel validation rows, plus 148,077 per-kernel NCU profiles, across 9 models, 14 GPU configurations, and 2 serving engines (vLLM 0.19.0, SGLang 0.5.9). All models served in BF16 except gpt-oss, which uses mxfp4 for projection weights.
## Dataset configurations

### trace_replay (2,932 rows)
Replays exact ISL/OSL sequences from recorded agent sessions (SWE-Bench, TerminalBench, OSWorld, ShareGPT). 77 unique (model, hardware, engine) combinations across 17 profiles.
17 profiles: chat-medium, chat-multiturn-long, chat-multiturn-medium, chat-multiturn-short, chat-short, chat-singleturn, coding-singleturn, decode-heavy, osworld-multiturn-long, osworld-multiturn-medium, osworld-multiturn-short, prefill-heavy, random-1k, swebench-multiturn-medium, swebench-multiturn-short, terminalbench-multiturn-medium, terminalbench-multiturn-short
### synthetic_distributional (265 rows)
ISL/OSL sampled from lognormal fits to real workload statistics. 38 unique (model, hardware, engine) combinations across 5 profiles.
5 profiles: chat-multiturn-synth, chat-singleturn-synth, osworld-multiturn-synth, swebench-multiturn-synth, terminalbench-multiturn-synth
### per_layer_kernel (37 rows)
Per-component operational intensity decomposition and Nsight Compute kernel profiles for Llama-3.1-8B on H100 (prefill phase). Analytical rows provide computed FLOPs, bytes, and OI per model component at batch sizes 1 and 80. NCU rows report measured SM and memory throughput per kernel from an 8-layer forward pass. Record types: analytical_total, analytical_component, ncu_kernel.
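The analytical rows boil down to standard roofline arithmetic. As a minimal sketch (the formula and example shapes below are assumptions for illustration, not the dataset's own code), the operational intensity of a single bf16 GEMM component is:

```python
def gemm_oi(m: int, n: int, k: int, bytes_per_elem: int = 2) -> float:
    """Roofline operational intensity (FLOP/byte) for an M*K @ K*N GEMM in bf16."""
    flops = 2 * m * n * k  # one multiply-accumulate = 2 FLOPs
    # Assumes A, B, and C each move through DRAM exactly once.
    bytes_moved = bytes_per_elem * (m * k + k * n + m * n)
    return flops / bytes_moved

# Hypothetical q_proj-like shape at small batch: 128 tokens, 4096x4096 weight.
print(gemm_oi(128, 4096, 4096))  # ~120 FLOP/byte
```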
### kernels_labeled (148,077 rows)
Per-kernel Nsight Compute (ncu) profiles across 4 GPUs (A100, H100, RTX 3090, RTX 2080 Ti) and 13 model/sweep sources. Columns cover kernel identity (source, gpu, model, kernel_family, kernel_name, dtype, held_out), operator shape (M, N, K, bs, seq, n_heads, head_dim, kv_heads, numel, op_type), and launch/measurement fields (gpu_time_duration_ms, launch_block_size, launch_grid_size, dram_bytes_sum, launch_registers_per_thread).
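These columns are enough to estimate achieved DRAM bandwidth per kernel. A small sketch, assuming dram_bytes_sum is in bytes and gpu_time_duration_ms in milliseconds (as the names suggest):

```python
def achieved_bw_gbps(dram_bytes_sum: float, gpu_time_duration_ms: float) -> float:
    """Achieved DRAM bandwidth in GB/s for one profiled kernel."""
    return dram_bytes_sum / (gpu_time_duration_ms * 1e-3) / 1e9

# Example values from an A100 gate-projection GEMM row:
print(achieved_bw_gbps(475_470_000, 0.58064))  # ~819 GB/s of a ~1,555 GB/s peak
```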
### mse_validation (28 rows)
Curated H100 / Llama-3.1-8B / vLLM validation table for the distributional synthetic replay generator. Paired synthetic and real trace replay runs; supplementary rows preserve no-replacement and high-concurrency debug runs. Raw JSON artifacts referenced through R2 URI columns. Per-run successful/failed request counts retained.
## Quality filtering
Configurations where fewer than 75% of requests completed successfully are excluded. Summary metrics are computed from successful requests only.
| Config | Rows |
|---|---|
| trace_replay | 2,932 |
| synthetic_distributional | 265 |
| per_layer_kernel | 37 |
| kernels_labeled | 148,077 |
| mse_validation | 28 |
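The released trace_replay and synthetic_distributional splits already have this filter applied. On a split that retains per-run success counts (mse_validation), the threshold could be reproduced roughly as follows; the success-count column name is an assumption, since only num_requests is documented in the schema below:

```python
from datasets import load_dataset

ds = load_dataset("agent-perf-bench/AgentPerfBench", "mse_validation")["train"]
# "num_successful_requests" is an assumed column name, not guaranteed by the card.
kept = ds.filter(
    lambda r: r["num_successful_requests"] / r["num_requests"] >= 0.75
)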
## Coverage

### Hardware
All benchmarks collected on PyTorch 2.10.0.
| GPU | VRAM | Memory bandwidth | Peak half-precision TFLOPS |
|---|---|---|---|
| NVIDIA H100 SXM | 80 GB | 3.35 TB/s | 989 |
| NVIDIA A100 SXM4 | 40 GB | 1.56 TB/s | 312 |
| NVIDIA RTX 3090 | 24 GB | 936 GB/s | 71 |
| NVIDIA RTX 2080 Ti | 11 GB | 616 GB/s | 27 |
Multi-GPU configurations: 1, 2, 4, or 8 GPUs with tensor parallelism.
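These peaks imply roofline ridge points, i.e. the operational intensity at which a kernel flips from bandwidth-bound to compute-bound. A quick back-of-the-envelope from the table above:

```python
# (peak FLOP/s, peak memory bandwidth in B/s) per GPU, from the hardware table.
gpus = {
    "H100":        (989e12, 3.35e12),
    "A100":        (312e12, 1.56e12),
    "RTX 3090":    (71e12,  936e9),
    "RTX 2080 Ti": (27e12,  616e9),
}
for name, (flops, bw) in gpus.items():
    print(f"{name}: ridge ~ {flops / bw:.0f} FLOP/byte")
# H100 ~295, A100 ~200, RTX 3090 ~76, RTX 2080 Ti ~44
```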
### Models
All models served in BF16 unless noted.
| Model | Family | Parameters | Architecture | Notes |
|---|---|---|---|---|
| Llama-3.1-8B | Llama | 8B | Dense | |
| Llama-3.1-70B | Llama | 70B | Dense | |
| Llama-3.3-70B | Llama | 70B | Dense | |
| Qwen2.5-72B | Qwen | 72B | Dense | |
| Qwen3.5-9B | Qwen | 9B | Dense | |
| Qwen3.5-27B | Qwen | 27B | Dense | |
| Mixtral-8x7B | Mixtral | 46.7B (12.9B active) | MoE | |
| gpt-oss-20b | GPT-OSS | 21B (3.6B active) | MoE | mxfp4 projections |
| gpt-oss-120b | GPT-OSS | 117B (5.1B active) | MoE | mxfp4 projections |
### Engines
- vLLM 0.19.0
- SGLang 0.5.9
## Schema
Each row of summary.parquet (the trace_replay and synthetic_distributional configurations) has the following columns:
| Column | Type | Description |
|---|---|---|
| run_id | string | Deterministic hash of run parameters |
| model | string | Model short name |
| model_family | string | Model family (llama, qwen, gpt-oss, mixtral) |
| hardware | string | GPU configuration (e.g., H100x4) |
| engine | string | Serving engine (vllm, sglang) |
| tensor_parallelism | int | TP degree |
| profile | string | Workload profile name |
| concurrency | int | Concurrent request level |
| num_requests | int | Total requests in run |
| duration_s | float | Total run duration |
| request_throughput | float | Requests/second |
| input_token_throughput | float | Input tokens/second |
| output_token_throughput | float | Output tokens/second |
| total_token_throughput | float | Total tokens/second |
| mean/median/p90/p99_ttft_ms | float | Time to first token |
| mean/median/p90/p99_tpot_ms | float | Time per output token |
| mean/median/p90/p99_itl_ms | float | Inter-token latency |
| mean/median/p90/p99_e2el_ms | float | End-to-end latency |
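The card states only that run_id is a deterministic hash of run parameters; the exact recipe is undocumented, but the idea is along these lines (illustrative assumption only):

```python
import hashlib
import json

def run_id(params: dict) -> str:
    # Canonicalize the parameter dict so the same run always hashes identically.
    canonical = json.dumps(params, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]
```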
## Loading
```python
from datasets import load_dataset

ds = load_dataset("agent-perf-bench/AgentPerfBench", "trace_replay")
# or "synthetic_distributional", "per_layer_kernel", "kernels_labeled", "mse_validation"
```
## Benchmark methodology
- Closed-loop concurrency with semaphore control (sketched after this list).
- 3-request warmup before each configuration.
- Metrics: TTFT, TPOT, ITL, E2EL, request throughput, token throughput (mean, median, p90, p99).
- Metrics computed over successful requests only.
- Collection period: March 2026 onwards.
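The closed-loop pattern from the first bullet can be sketched as follows; this is illustrative only, with send_request standing in for the actual engine client and warmup/metric collection omitted:

```python
import asyncio

async def send_request(req):
    """Stand-in for the real engine client (e.g. a vLLM/SGLang HTTP call)."""
    await asyncio.sleep(0.01)
    return {"req": req, "ok": True}

async def run_closed_loop(requests, concurrency: int):
    # Closed loop: a fixed pool of in-flight requests; a new request is
    # issued only when a slot frees up, enforced by the semaphore.
    sem = asyncio.Semaphore(concurrency)

    async def one(req):
        async with sem:
            return await send_request(req)

    return await asyncio.gather(*(one(r) for r in requests))

# asyncio.run(run_closed_loop(list(range(100)), concurrency=8))
```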
## Limitations
- Distributional profiles are fitted approximations, not direct production replays.
- Closed-loop concurrency only; no open-loop (Poisson) arrivals.
## Ethical considerations
No PII. Trace-replay profiles derive from open benchmarks (SWE-Bench, MIT-licensed; TerminalBench; OSWorld) and ShareGPT conversations. Synthetic profiles use random tokens.
## License
Benchmark data released under Apache-2.0. Source datasets retain their original licenses.
## Source datasets
- SWE-Bench
- TerminalBench
- OSWorld
- ShareGPT
## Future releases
- Additional hardware configurations and model families.
- Open-loop (Poisson) arrival mode.
- Additional per-kernel roofline profiles.
## Citation

```bibtex
@inproceedings{agentperfbench2026,
  title={AgentPerfBench: A Benchmarking and Evaluation Suite for Inference Performance of Agentic LLMs},
  author={Anonymous},
  booktitle={NeurIPS 2026 Evaluations and Datasets Track},
  year={2026}
}
```