Columns (name, dtype, observed range or class count):

column,dtype,range
model_id,stringlengths,10 to 94
leaderboard_acc_none,float64,0.11 to 0.63
leaderboard_acc_norm_none,float64,0.27 to 0.63
leaderboard_bbh_acc_norm_none,float64,0.26 to 0.7
leaderboard_bbh_boolean_expressions_acc_norm_none,float64,0.42 to 0.94
leaderboard_bbh_causal_judgement_acc_norm_none,float64,0.45 to 0.66
leaderboard_bbh_date_understanding_acc_norm_none,float64,0.12 to 0.82
leaderboard_bbh_disambiguation_qa_acc_norm_none,float64,0.25 to 0.75
leaderboard_bbh_formal_fallacies_acc_norm_none,float64,0.44 to 0.82
leaderboard_bbh_geometric_shapes_acc_norm_none,float64,0 to 0.61
leaderboard_bbh_hyperbaton_acc_norm_none,float64,0.34 to 0.91
leaderboard_bbh_logical_deduction_five_objects_acc_norm_none,float64,0.15 to 0.71
leaderboard_bbh_logical_deduction_seven_objects_acc_norm_none,float64,0.11 to 0.66
leaderboard_bbh_logical_deduction_three_objects_acc_norm_none,float64,0.3 to 0.95
leaderboard_bbh_movie_recommendation_acc_norm_none,float64,0 to 0.86
leaderboard_bbh_navigate_acc_norm_none,float64,0.41 to 0.84
leaderboard_bbh_object_counting_acc_norm_none,float64,0 to 0.59
leaderboard_bbh_penguins_in_a_table_acc_norm_none,float64,0.12 to 0.83
leaderboard_bbh_reasoning_about_colored_objects_acc_norm_none,float64,0.02 to 0.88
leaderboard_bbh_ruin_names_acc_norm_none,float64,0.03 to 0.85
leaderboard_bbh_salient_translation_error_detection_acc_norm_none,float64,0.11 to 0.72
leaderboard_bbh_snarks_acc_norm_none,float64,0.39 to 0.87
leaderboard_bbh_sports_understanding_acc_norm_none,float64,0.45 to 0.81
leaderboard_bbh_temporal_sequences_acc_norm_none,float64,0.04 to 1
leaderboard_bbh_tracking_shuffled_objects_five_objects_acc_norm_none,float64,0.11 to 0.3
leaderboard_bbh_tracking_shuffled_objects_seven_objects_acc_norm_none,float64,0.09 to 0.29
leaderboard_bbh_tracking_shuffled_objects_three_objects_acc_norm_none,float64,0.24 to 0.44
leaderboard_bbh_web_of_lies_acc_norm_none,float64,0.44 to 0.68
leaderboard_exact_match_none,float64,0 to 0.93
leaderboard_gpqa_acc_norm_none,float64,0.21 to 0.42
leaderboard_gpqa_diamond_acc_norm_none,float64,0.19 to 0.45
leaderboard_gpqa_extended_acc_norm_none,float64,0.21 to 0.42
leaderboard_gpqa_main_acc_norm_none,float64,0.18 to 0.43
leaderboard_ifeval_inst_level_strict_acc_none,float64,0 to 0.79
leaderboard_ifeval_prompt_level_strict_acc_none,float64,0 to 0.71
leaderboard_inst_level_strict_acc_none,float64,0 to 0.79
leaderboard_math_algebra_hard_exact_match_none,float64,0 to 0.95
leaderboard_math_counting_and_prob_hard_exact_match_none,float64,0 to 0.95
leaderboard_math_geometry_hard_exact_match_none,float64,0 to 0.89
leaderboard_math_hard_exact_match_none,float64,0 to 0.93
leaderboard_math_intermediate_algebra_hard_exact_match_none,float64,0 to 0.9
leaderboard_math_num_theory_hard_exact_match_none,float64,0 to 0.97
leaderboard_math_prealgebra_hard_exact_match_none,float64,0 to 0.93
leaderboard_math_precalculus_hard_exact_match_none,float64,0 to 0.87
leaderboard_mmlu_pro_acc_none,float64,0.11 to 0.63
leaderboard_musr_acc_norm_none,float64,0.28 to 0.53
leaderboard_musr_murder_mysteries_acc_norm_none,float64,0.46 to 0.64
leaderboard_musr_object_placements_acc_norm_none,float64,0.22 to 0.48
leaderboard_musr_team_allocation_acc_norm_none,float64,0.13 to 0.62
leaderboard_prompt_level_strict_acc_none,float64,0 to 0.71
Average ⬆️,float64,17.3 to 57.3
Architecture,stringclasses,13 values
Model sha,stringlengths,40 to 40
Hub License,stringclasses,7 values
Hub ❤️,float64,0 to 4.48k
Hub downloads,float64,0 to 13.7M
#Params (B),float64,0 to 32.8
Available on the hub,bool,1 class
Chat Template,bool,2 classes
Base Model,stringclasses,36 values
Hub lastModified,stringdate,2024-04-30 18:45:02 to 2026-03-12 19:32:45
library_name,stringclasses,3 values
pipeline_tag,stringclasses,2 values
gated,stringclasses,3 values
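The reconstructed rows can be ranked programmatically by the Average ⬆️ column. A minimal sketch in Python, using three (model_id, Average) pairs copied from the rows below; a real run would load every row from the dataset rather than this hand-copied sample:

```python
# Rank a sample of leaderboard rows by their "Average" score.
# The (model_id, average) pairs are copied from the table in this document.
rows = [
    ("allenai/Llama-3.1-Tulu-3-8B", 39.575548),
    ("openai/gpt-oss-20b", 35.072429),
    ("Qwen/Qwen3-8B", 47.934832),
]

# Sort descending on the score (second tuple element).
ranked = sorted(rows, key=lambda r: r[1], reverse=True)

for model, avg in ranked:
    print(f"{avg:6.2f}  {model}")
```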
allenai/Llama-3.1-Tulu-3-8B-SFT
0.321892
0.449215
0.48377
0.796
0.593583
0.452
0.428
0.572
0.268
0.636
0.388
0.34
0.568
0.748
0.648
0.476
0.438356
0.556
0.424
0.44
0.719101
0.792
0.28
0.164
0.136
0.28
0.544
0.090634
0.295302
0.318182
0.278388
0.305804
0.406475
0.319778
0.406475
0.166124
0.056911
0.060606
0.090634
0.028571
0.032468
0.186528
0.037037
0.321892
0.428571
0.56
0.398438
0.328
0.319778
33.777408
LlamaForCausalLM
f2a0b46b0cfda21003c6141b1ff837b7e165524d
llama3.1
37
12,444
8.030327
true
true
allenai/Llama-3.1-Tulu-3-8B-SFT (Merge)
2025-01-30T00:46:11.000Z
transformers
text-generation
False
allenai/Llama-3.1-Tulu-3-8B
0.322141
0.450383
0.484812
0.824
0.604278
0.512
0.464
0.58
0.288
0.62
0.372
0.368
0.576
0.728
0.648
0.456
0.417808
0.568
0.464
0.46
0.61236
0.776
0.176
0.204
0.156
0.268
0.532
0.186556
0.306208
0.343434
0.289377
0.310268
0.659472
0.563771
0.659472
0.338762
0.138211
0.090909
0.186556
0.039286
0.162338
0.357513
0.066667
0.322141
0.415344
0.56
0.328125
0.36
0.563771
39.575548
LlamaForCausalLM
666943798adbde0b1aff34626007e26986a3c107
llama3.1
178
2,276
8.03
true
true
allenai/Llama-3.1-Tulu-3-8B (Merge)
2025-02-13T20:21:13.000Z
transformers
text-generation
False
allenai/Llama-3.1-Tulu-3-8B-DPO
0.325216
0.45155
0.486895
0.824
0.636364
0.5
0.468
0.572
0.256
0.604
0.38
0.38
0.572
0.716
0.652
0.456
0.458904
0.58
0.456
0.44
0.657303
0.776
0.208
0.216
0.156
0.268
0.528
0.179003
0.303691
0.328283
0.289377
0.310268
0.653477
0.545287
0.653477
0.37785
0.081301
0.060606
0.179003
0.035714
0.11039
0.362694
0.044444
0.325216
0.415344
0.556
0.347656
0.344
0.545287
39.393769
LlamaForCausalLM
a7beb67e33ffd01cc87ac3b46cadc1000985b8db
llama3.1
30
5,803
8
true
true
allenai/Llama-3.1-Tulu-3-8B-DPO (Merge)
2025-06-11T05:03:09.000Z
transformers
text-generation
False
allenai/Llama-3.1-Tulu-3.1-8B
0.313165
0.455442
0.495227
0.82
0.636364
0.464
0.66
0.564
0.276
0.656
0.336
0.356
0.572
0.732
0.644
0.452
0.410959
0.552
0.504
0.448
0.640449
0.796
0.208
0.208
0.164
0.3
0.528
0.216012
0.28104
0.29798
0.285714
0.267857
0.691847
0.5878
0.691847
0.423453
0.138211
0.090909
0.216012
0.05
0.11039
0.445596
0.074074
0.313165
0.427249
0.576
0.363281
0.344
0.5878
40.408983
LlamaForCausalLM
46239c2d07db76b412e1f1b0b4542f65b81fe01f
llama3.1
39
3,290
8.030327
true
true
meta-llama/Llama-3.1-8B
2025-02-10T19:45:51.000Z
transformers
text-generation
False
allenai/tulu-2-dpo-7b
0.219581
0.400311
0.423711
0.732
0.582888
0.456
0.476
0.488
0.336
0.636
0.3
0.232
0.416
0.732
0.416
0.376
0.39726
0.304
0.376
0.336
0.623596
0.74
0.112
0.18
0.144
0.372
0.492
0.015106
0.261745
0.272727
0.272894
0.243304
0.378897
0.253235
0.378897
0.013029
0.01626
0.015152
0.015106
0.014286
0.019481
0.020725
0.007407
0.219581
0.440476
0.528
0.332031
0.464
0.253235
28.991934
LlamaForCausalLM
b57ef95260b6d4e726adf64518af038e5673f126
other
20
1,687
null
true
true
meta-llama/Llama-2-7b-hf
2024-05-14T03:06:00.000Z
transformers
text-generation
False
allenai/tulu-2-7b
0.212101
0.395123
0.415726
0.716
0.572193
0.468
0.452
0.512
0.316
0.576
0.296
0.248
0.432
0.736
0.424
0.368
0.321918
0.344
0.352
0.36
0.601124
0.676
0.076
0.204
0.14
0.352
0.488
0.017372
0.266779
0.247475
0.276557
0.263393
0.276978
0.155268
0.276978
0.026059
0.01626
0.022727
0.017372
0.010714
0.006494
0.015544
0.022222
0.212101
0.440476
0.564
0.335938
0.424
0.155268
27.157204
LlamaForCausalLM
3c6e328ae91fabdd0daf09de16887de9615c1f66
null
11
463
null
true
true
meta-llama/Llama-2-7b-hf
2024-04-30T18:45:02.000Z
transformers
text-generation
False
openai/gpt-oss-20b
0.255901
0.411208
0.424926
0.692
0.545455
0.5
0.424
0.468
0.384
0.524
0.396
0.376
0.5
0.46
0.616
0.072
0.431507
0.352
0.5
0.34
0.578652
0.46
0.532
0.176
0.132
0.328
0.488
0.204683
0.330537
0.338384
0.307692
0.354911
0.454436
0.310536
0.454436
0.302932
0.235772
0.098485
0.204683
0.078571
0.279221
0.336788
0.044444
0.255901
0.433862
0.516
0.304688
0.484
0.310536
35.072429
GptOssForCausalLM
6cee5e81ee83917806bbde320786a8fb61efebee
apache-2.0
4,476
6,900,438
21.511954
true
true
openai/gpt-oss-20b
2025-08-26T17:25:47.000Z
transformers
text-generation
False
unsloth/gpt-oss-20b-BF16
0.255901
0.411208
0.424926
0.692
0.545455
0.5
0.424
0.468
0.384
0.524
0.396
0.376
0.5
0.46
0.616
0.072
0.431507
0.352
0.5
0.34
0.578652
0.46
0.532
0.176
0.132
0.328
0.488
0.207704
0.330537
0.338384
0.307692
0.354911
0.455635
0.321627
0.455635
0.286645
0.211382
0.090909
0.207704
0.060714
0.357143
0.336788
0.088889
0.255901
0.433862
0.516
0.304688
0.484
0.321627
35.142765
GptOssForCausalLM
cc89b3e7fd423253264883a80a4fa5abc619649f
apache-2.0
32
108,744
20.914757
true
true
openai/gpt-oss-20b
2025-08-05T22:00:47.000Z
transformers
text-generation
False
unsloth/gpt-oss-20b
0.255901
0.411208
0.424926
0.692
0.545455
0.5
0.424
0.468
0.384
0.524
0.396
0.376
0.5
0.46
0.616
0.072
0.431507
0.352
0.5
0.34
0.578652
0.46
0.532
0.176
0.132
0.328
0.488
0.207704
0.330537
0.338384
0.307692
0.354911
0.455635
0.321627
0.455635
0.286645
0.211382
0.090909
0.207704
0.060714
0.357143
0.336788
0.088889
0.255901
0.433862
0.516
0.304688
0.484
0.321627
35.142765
GptOssForCausalLM
e220476dc09936adfed96d0451acfa3601c23bd7
apache-2.0
44
223,912
21.511954
true
true
openai/gpt-oss-20b
2025-08-09T23:34:19.000Z
transformers
text-generation
False
axolotl-ai-co/gpt-oss-20b-dequantized
0.255901
0.411208
0.424926
0.692
0.545455
0.5
0.424
0.468
0.384
0.524
0.396
0.376
0.5
0.46
0.616
0.072
0.431507
0.352
0.5
0.34
0.578652
0.46
0.532
0.176
0.132
0.328
0.488
0.207704
0.330537
0.338384
0.307692
0.354911
0.455635
0.321627
0.455635
0.286645
0.211382
0.090909
0.207704
0.060714
0.357143
0.336788
0.088889
0.255901
0.433862
0.516
0.304688
0.484
0.321627
35.142765
GptOssForCausalLM
f475688514cdda82c15ef95db0ac31edc026b608
apache-2.0
1
1,562
20.914757
true
true
openai/gpt-oss-20b
2025-08-06T03:50:20.000Z
transformers
text-generation
False
textcleanlm/fidelity-gpt-oss-16bit
0.349651
0.471397
0.504773
0.812
0.561497
0.644
0.424
0.552
0.464
0.556
0.424
0.364
0.668
0.812
0.596
0.252
0.506849
0.508
0.804
0.516
0.488764
0.56
0.512
0.16
0.156
0.296
0.488
0.195619
0.336409
0.373737
0.331502
0.325893
0.491607
0.343808
0.491607
0.315961
0.203252
0.128788
0.195619
0.05
0.24026
0.300518
0.081481
0.349651
0.429894
0.524
0.367188
0.4
0.343808
38.465901
GptOssForCausalLM
fdcbe99b046542b72566f21ffc10ee34d416c332
apache-2.0
0
7
20.914757
true
true
openai/gpt-oss-20b
2025-10-13T01:29:36.000Z
transformers
text-generation
False
michele556/gpt-oss-20b-finetuned-59k-v1-hyperpod
0.334358
0.450902
0.471793
0.812
0.502674
0.512
0.504
0.496
0.388
0.604
0.432
0.344
0.588
0.792
0.62
0.424
0.465753
0.404
0.496
0.336
0.488764
0.64
0.32
0.184
0.136
0.336
0.508
0.074018
0.34396
0.333333
0.340659
0.352679
0.431655
0.288355
0.431655
0.100977
0.081301
0.015152
0.074018
0.010714
0.136364
0.129534
0.044444
0.334358
0.460317
0.536
0.371094
0.476
0.288355
35.268358
GptOssForCausalLM
ea6df67664d34f9dd43ca07a06ff3ce3c48b2f9a
null
0
0
20.914757
true
true
openai/gpt-oss-20b
2025-09-26T05:17:43.000Z
transformers
text-generation
False
NotEvilAI/gpt-oss-20b-ru-reasoner
0.412566
0.508886
0.545218
0.864
0.614973
0.604
0.54
0.592
0.416
0.572
0.5
0.456
0.7
0.788
0.624
0.444
0.547945
0.572
0.688
0.54
0.629213
0.592
0.676
0.184
0.148
0.336
0.5
0.268127
0.349832
0.348485
0.342491
0.359375
0.473621
0.32902
0.473621
0.433225
0.268293
0.136364
0.268127
0.085714
0.285714
0.46114
0.103704
0.412566
0.482804
0.568
0.40625
0.476
0.32902
42.202813
GptOssForCausalLM
b533603969c4fc160e245275dcdec4c2d3f28251
mit
17
32
20.914757
true
true
openai/gpt-oss-20b
2025-09-22T10:06:48.000Z
transformers
text-generation
False
textcleanlm/fidelity-gpt-oss
0.349651
0.471397
0.504773
0.812
0.561497
0.644
0.424
0.552
0.464
0.556
0.424
0.364
0.668
0.812
0.596
0.252
0.506849
0.508
0.804
0.516
0.488764
0.56
0.512
0.16
0.156
0.296
0.488
0.195619
0.336409
0.373737
0.331502
0.325893
0.492806
0.343808
0.492806
0.315961
0.203252
0.128788
0.195619
0.05
0.24026
0.300518
0.081481
0.349651
0.429894
0.524
0.367188
0.4
0.343808
38.485885
GptOssForCausalLM
8546fe530abdcc533d80a09b93d6854a82292dee
apache-2.0
1
2
21.511954
true
true
openai/gpt-oss-20b
2025-10-23T11:58:46.000Z
transformers
text-generation
False
AmanPriyanshu/gpt-oss-5.4b-specialized-safety-pruned-moe-only-6-experts
0.119598
0.317291
0.325985
0.46
0.486631
0.204
0.3
0.468
0.132
0.552
0.22
0.18
0.368
0.368
0.572
0.156
0.267123
0.116
0.304
0.248
0.52809
0.528
0.224
0.204
0.148
0.352
0.512
0.003776
0.249161
0.262626
0.258242
0.232143
0.346523
0.207024
0.346523
0.006515
0
0.007576
0.003776
0
0.006494
0.005181
0
0.119598
0.358466
0.552
0.269531
0.256
0.207024
23.391812
GptOssForCausalLM
e02ff0f370e8ec568d5bf1117419bda2ca773421
apache-2.0
1
8
5.380451
true
true
openai/gpt-oss-20b
2025-08-13T08:08:18.000Z
null
text-generation
False
huihui-ai/Huihui-gpt-oss-20b-mxfp4-abliterated
0.19872
0.345181
0.344905
0.672
0.454545
0.512
0.368
0.468
0.2
0.496
0.292
0.28
0.364
0.256
0.44
0.064
0.287671
0.256
0.264
0.312
0.449438
0.46
0.208
0.216
0.16
0.344
0.488
0.061178
0.28104
0.282828
0.267399
0.296875
0.447242
0.299445
0.447242
0.127036
0.056911
0.015152
0.061178
0.007143
0.084416
0.082902
0.014815
0.19872
0.448413
0.492
0.277344
0.58
0.299445
29.691648
null
null
null
null
null
null
null
null
null
null
null
null
null
justinj92/gpt-oss-nemo-20b
0.421376
0.532105
0.565527
0.868
0.57754
0.58
0.58
0.644
0.388
0.74
0.436
0.368
0.688
0.816
0.688
0.476
0.630137
0.672
0.808
0.536
0.640449
0.636
0.64
0.232
0.136
0.344
0.5
0.166918
0.392617
0.328283
0.388278
0.426339
0.443645
0.306839
0.443645
0.302932
0.130081
0.045455
0.166918
0.053571
0.162338
0.305699
0.051852
0.421376
0.497354
0.532
0.429688
0.532
0.306839
41.45731
GptOssForCausalLM
efff5bd2bc87812b6d69fef5c6c535e6ece034f0
apache-2.0
6
3
20.914757
true
true
openai/gpt-oss-20b
2025-08-06T08:26:39.000Z
transformers
text-generation
False
xd2010/gpt-oss-20b-math7k-1epoch
0.380984
0.535478
0.578719
0.848
0.620321
0.624
0.592
0.692
0.432
0.74
0.492
0.476
0.792
0.828
0.672
0.54
0.616438
0.7
0.728
0.6
0.691011
0.632
0.396
0.216
0.164
0.312
0.544
0.143505
0.372483
0.373737
0.377289
0.366071
0.394484
0.256932
0.394484
0.224756
0.105691
0.05303
0.143505
0.039286
0.155844
0.321244
0.02963
0.380984
0.462963
0.556
0.398438
0.436
0.256932
38.885636
GptOssForCausalLM
2438e4e0d73daf88102379e0d19b53905ad00b82
null
0
0
0.004759
true
true
openai/gpt-oss-20b
2025-10-19T22:45:02.000Z
transformers
text-generation
False
EmilRyd/gpt-oss-20b-olympiads-sonnet-45-conditional-no-geometry-prompt-benign-answer-1
0.268368
0.420288
0.434994
0.7
0.55615
0.484
0.34
0.468
0.46
0.528
0.42
0.36
0.544
0.576
0.648
0.084
0.445205
0.364
0.508
0.3
0.55618
0.46
0.552
0.184
0.148
0.336
0.488
0.157855
0.344799
0.333333
0.327839
0.370536
0.438849
0.295749
0.438849
0.228013
0.195122
0.113636
0.157855
0.064286
0.233766
0.196891
0.059259
0.268368
0.427249
0.492
0.285156
0.508
0.295749
34.535214
GptOssForCausalLM
015d65fc038c309b95896a4230aa5288ffca20f6
null
0
0
20.914757
true
true
openai/gpt-oss-20b
2025-10-12T12:30:04.000Z
transformers
text-generation
False
EmilRyd/gpt-oss-20b-olympiads-sonnet-45-conditional-training-prompt-benign-answer-reasoning-2
0.345911
0.478532
0.50217
0.732
0.524064
0.612
0.356
0.532
0.428
0.516
0.496
0.52
0.664
0.864
0.628
0.34
0.486301
0.568
0.672
0.364
0.58427
0.468
0.58
0.18
0.136
0.3
0.524
0.229607
0.383389
0.444444
0.358974
0.386161
0.460432
0.312384
0.460432
0.348534
0.227642
0.098485
0.229607
0.103571
0.298701
0.373057
0.066667
0.345911
0.448413
0.548
0.386719
0.412
0.312384
39.498692
GptOssForCausalLM
8ad02385500d31ea91079a00e4ea9b9793ae9e42
null
0
0
20.914757
true
true
openai/gpt-oss-20b
2025-10-10T14:50:25.000Z
transformers
text-generation
False
EmilRyd/gpt-oss-20b-olympiads-gemini-malign-prompt-benign-answer-1
0.284242
0.447918
0.465023
0.696
0.572193
0.564
0.412
0.468
0.492
0.512
0.44
0.416
0.632
0.656
0.656
0.232
0.438356
0.432
0.484
0.356
0.573034
0.464
0.588
0.18
0.144
0.312
0.488
0.154079
0.364933
0.388889
0.344322
0.379464
0.456835
0.310536
0.456835
0.237785
0.162602
0.045455
0.154079
0.057143
0.201299
0.284974
0.022222
0.284242
0.448413
0.508
0.3125
0.528
0.310536
36.225402
GptOssForCausalLM
6f08c9810e83c890abecd82cd9cd9e887ec91c19
null
0
0
20.914757
true
true
openai/gpt-oss-20b
2025-10-09T22:02:30.000Z
transformers
text-generation
False
EmilRyd/gpt-oss-20b-olympiads-sonnet-45-conditional-no-geometry-prompt-benign-answer-10
0.320229
0.448048
0.468148
0.756
0.518717
0.584
0.416
0.556
0.48
0.508
0.444
0.34
0.648
0.712
0.632
0.324
0.438356
0.36
0.504
0.388
0.516854
0.46
0.54
0.172
0.132
0.332
0.488
0.240181
0.356544
0.39899
0.349817
0.345982
0.459233
0.310536
0.459233
0.332248
0.243902
0.136364
0.240181
0.096429
0.402597
0.352332
0.081481
0.320229
0.439153
0.536
0.316406
0.468
0.310536
38.058137
GptOssForCausalLM
fad27bba05e4c7340b3690785091841809e303f6
null
0
0
20.914757
true
true
openai/gpt-oss-20b
2025-10-12T12:36:17.000Z
transformers
text-generation
False
EmilRyd/gpt-oss-20b-olympiads-sonnet-45-conditional-training-prompt-benign-answer-reasoning-1
0.324884
0.455442
0.479083
0.608
0.497326
0.636
0.348
0.528
0.484
0.504
0.464
0.452
0.664
0.732
0.62
0.324
0.520548
0.516
0.66
0.308
0.544944
0.464
0.596
0.176
0.116
0.296
0.48
0.247734
0.356544
0.373737
0.335165
0.375
0.44964
0.308688
0.44964
0.364821
0.235772
0.166667
0.247734
0.114286
0.363636
0.331606
0.096296
0.324884
0.431217
0.496
0.34375
0.456
0.308688
38.151702
GptOssForCausalLM
458e6566d807245f2d144256ebe7971ebbc22c67
null
0
0
20.914757
true
true
openai/gpt-oss-20b
2025-10-10T15:03:11.000Z
transformers
text-generation
False
EmilRyd/gpt-oss-20b-olympiads-sonnet-45-conditional-training-prompt-benign-answer-reasoning-10
0.393035
0.511091
0.553376
0.868
0.566845
0.608
0.444
0.576
0.364
0.612
0.504
0.544
0.744
0.812
0.676
0.412
0.664384
0.676
0.808
0.52
0.651685
0.508
0.652
0.18
0.168
0.312
0.488
0.119335
0.355705
0.378788
0.338828
0.366071
0.447242
0.314233
0.447242
0.140065
0.138211
0.037879
0.119335
0.053571
0.149351
0.253886
0.044444
0.393035
0.433862
0.528
0.394531
0.38
0.314233
38.375935
GptOssForCausalLM
50daa113e8ecb973940e6bc018f5525328ccbcef
null
0
0
20.914757
true
true
openai/gpt-oss-20b
2025-10-10T14:58:51.000Z
transformers
text-generation
False
EmilRyd/gpt-oss-20b-olympiads-gemini-malign-prompt-benign-answer-100
0.402842
0.476197
0.509981
0.796
0.534759
0.556
0.452
0.556
0.296
0.572
0.452
0.436
0.608
0.676
0.66
0.38
0.554795
0.496
0.712
0.48
0.606742
0.472
0.836
0.184
0.14
0.344
0.492
0.117069
0.35151
0.353535
0.336996
0.368304
0.434053
0.28281
0.434053
0.162866
0.113821
0.05303
0.117069
0.025
0.116883
0.259067
0.066667
0.402842
0.415344
0.544
0.308594
0.396
0.28281
37.179993
GptOssForCausalLM
4033000d8f0d8b4c4d3dff859c2a26b91d5a780a
null
0
1
20.914757
true
true
openai/gpt-oss-20b
2025-10-09T22:24:54.000Z
transformers
text-generation
False
manuelcaccone/gpt-oss-actuarial-f16
0.321227
0.459852
0.488978
0.828
0.593583
0.652
0.668
0.564
0.352
0.724
0.376
0.332
0.56
0.712
0.636
0.4
0.438356
0.32
0.528
0.48
0.629213
0.556
0.26
0.184
0.128
0.372
0.488
0.137462
0.328859
0.338384
0.311355
0.345982
0.273381
0.155268
0.273381
0.241042
0.097561
0.037879
0.137462
0.057143
0.11039
0.264249
0.051852
0.321227
0.444444
0.544
0.375
0.416
0.155268
33.23919
GptOssForCausalLM
785638c4a6f93d17d50d7ddaa850fa300b192ff3
apache-2.0
0
0
20.914757
true
true
openai/gpt-oss-20b
2025-10-03T13:02:44.000Z
transformers
text-generation
False
TPLong/GPT-OSS-20B-mxfp4
0.355552
0.471267
0.5046
0.856
0.550802
0.632
0.52
0.584
0.4
0.644
0.44
0.368
0.68
0.628
0.608
0.404
0.547945
0.572
0.628
0.436
0.573034
0.636
0.28
0.184
0.152
0.328
0.508
0.223565
0.34396
0.434343
0.331502
0.319196
0.401679
0.256932
0.401679
0.34202
0.219512
0.075758
0.223565
0.075
0.214286
0.450777
0.096296
0.355552
0.417989
0.552
0.398438
0.304
0.256932
37.455742
GptOssForCausalLM
3af77aaa7bcf12c98d4a128b0e97aea9abc0a707
apache-2.0
0
1
21.511954
true
true
openai/gpt-oss-20b
2025-10-06T16:12:16.000Z
transformers
text-generation
False
AmanPriyanshu/gpt-oss-6.6b-specialized-all-pruned-moe-only-8-experts
0.160156
0.34544
0.358445
0.46
0.481283
0.256
0.336
0.472
0.312
0.516
0.296
0.26
0.416
0.524
0.6
0.128
0.226027
0.156
0.364
0.448
0.544944
0.536
0.176
0.184
0.124
0.328
0.488
0.006042
0.250839
0.262626
0.239927
0.258929
0.388489
0.240296
0.388489
0.009772
0
0.007576
0.006042
0.007143
0.012987
0
0
0.160156
0.395503
0.516
0.25
0.424
0.240296
25.991234
GptOssForCausalLM
4455b1b80ca47c12f078d0e0687c83022210613f
apache-2.0
2
66
6.575398
true
true
openai/gpt-oss-20b
2025-08-13T02:06:16.000Z
null
text-generation
False
sequelbox/gpt-oss-20b-DAG-Reasoning
0.120761
0.302374
0.306023
0.572
0.518717
0.2
0.248
0.544
0.084
0.516
0.192
0.16
0.336
0.264
0.456
0.236
0.219178
0.144
0.228
0.12
0.483146
0.448
0.276
0.184
0.136
0.336
0.512
0.006798
0.244128
0.191919
0.249084
0.261161
0.243405
0.134935
0.243405
0.009772
0
0
0.006798
0.007143
0.019481
0.005181
0
0.120761
0.366402
0.508
0.289063
0.304
0.134935
21.458618
GptOssForCausalLM
00e224f24c391c540064a44e5b62ab7c8fd1472b
apache-2.0
0
18
20.914757
true
true
openai/gpt-oss-20b
2026-03-12T19:32:45.000Z
transformers
text-generation
False
Ba2han/gpt-20b-finetune-2
0.404671
0.52549
0.569866
0.788
0.572193
0.596
0.52
0.636
0.412
0.692
0.496
0.448
0.776
0.756
0.668
0.452
0.671233
0.676
0.756
0.544
0.629213
0.632
0.82
0.188
0.188
0.332
0.488
0.153323
0.357383
0.373737
0.342491
0.368304
0.413669
0.292052
0.413669
0.257329
0.130081
0.045455
0.153323
0.057143
0.12987
0.300518
0.059259
0.404671
0.452381
0.548
0.40625
0.404
0.292052
39.188218
GptOssForCausalLM
aaf96a297b5c5e19ec660e50805607c96554db8c
apache-2.0
0
0
20.914757
true
true
openai/gpt-oss-20b
2025-10-04T08:40:26.000Z
transformers
text-generation
False
vanstudio/gpt-oss-20b-thinking-MXFP4
0.301695
0.457647
0.480299
0.848
0.486631
0.58
0.324
0.528
0.516
0.524
0.416
0.4
0.58
0.74
0.644
0.296
0.506849
0.424
0.596
0.424
0.52809
0.468
0.676
0.156
0.104
0.292
0.496
0.19864
0.364933
0.363636
0.344322
0.390625
0.456835
0.314233
0.456835
0.296417
0.162602
0.098485
0.19864
0.082143
0.337662
0.274611
0.081481
0.301695
0.431217
0.516
0.351563
0.428
0.314233
37.226981
GptOssForCausalLM
c4396216f4cee384dd80d9793075c0f964bc2b6f
apache-2.0
0
0
21.511954
true
true
openai/gpt-oss-20b
2025-10-18T23:49:37.000Z
transformers
text-generation
False
cuongdk253/gpt-oss-unsloth-ft-02102025-2
0.278258
0.431963
0.452526
0.828
0.486631
0.556
0.412
0.472
0.468
0.616
0.376
0.328
0.496
0.7
0.592
0.184
0.458904
0.38
0.472
0.376
0.516854
0.464
0.536
0.204
0.132
0.348
0.488
0.230363
0.342282
0.363636
0.326007
0.352679
0.43765
0.2939
0.43765
0.355049
0.268293
0.121212
0.230363
0.110714
0.311688
0.321244
0.044444
0.278258
0.416667
0.516
0.273438
0.464
0.2939
35.962409
GptOssForCausalLM
b03d655e97e831607d7e02a02bc68ff2c91497bd
apache-2.0
0
1
21.511954
true
true
openai/gpt-oss-20b
2025-10-02T13:07:32.000Z
transformers
text-generation
False
Tonic/med-gpt-oss-20b
0.419465
0.547672
0.58653
0.86
0.620321
0.616
0.484
0.68
0.428
0.728
0.476
0.504
0.764
0.768
0.664
0.496
0.726027
0.68
0.8
0.572
0.674157
0.628
0.836
0.172
0.156
0.348
0.488
0.225076
0.392617
0.388889
0.375458
0.415179
0.430456
0.286506
0.430456
0.413681
0.170732
0.098485
0.225076
0.089286
0.181818
0.388601
0.066667
0.419465
0.496032
0.572
0.410156
0.508
0.286506
42.502921
GptOssForCausalLM
1d4210d467585683ca73fc50f3a1afdbe426d912
apache-2.0
7
10
20.914757
true
true
openai/gpt-oss-20b
2025-08-10T01:30:38.000Z
transformers
text-generation
False
Qwen/Qwen3-8B
0.476978
0.553509
0.607186
0.9
0.59893
0.676
0.556
0.68
0.54
0.784
0.632
0.62
0.836
0.72
0.68
0.48
0.60274
0.72
0.72
0.588
0.561798
0.712
0.66
0.22
0.172
0.364
0.532
0.506798
0.369128
0.444444
0.355311
0.352679
0.480815
0.35305
0.480815
0.781759
0.487805
0.325758
0.506798
0.225
0.616883
0.683938
0.281481
0.476978
0.435185
0.524
0.300781
0.484
0.35305
47.934832
Qwen3ForCausalLM
b968826d9c46dd6066d109eabc6255188de91218
apache-2.0
1,004
9,091,135
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-07-26T03:49:13.000Z
transformers
text-generation
False
Qwen/Qwen3-8B-Base
0.472656
0.536905
0.583406
0.896
0.620321
0.688
0.6
0.636
0.432
0.78
0.636
0.584
0.888
0.768
0.672
0.436
0.636986
0.656
0.692
0.58
0.634831
0.652
0.472
0.164
0.168
0.248
0.508
0.21148
0.365772
0.353535
0.384615
0.348214
0.523981
0.404806
0.523981
0.355049
0.154472
0.075758
0.21148
0.071429
0.220779
0.430052
0.037037
0.472656
0.452381
0.536
0.421875
0.4
0.404806
43.494598
Qwen3ForCausalLM
49e3418fbbbca6ecbdf9608b4d22e5a407081db4
apache-2.0
94
1,613,629
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-05-21T05:59:01.000Z
transformers
text-generation
False
deepseek-ai/DeepSeek-R1-0528-Qwen3-8B
0.411902
0.47931
0.506683
0.8
0.57754
0.552
0.396
0.596
0.504
0.512
0.48
0.516
0.712
0.664
0.676
0.388
0.458904
0.5
0.436
0.404
0.488764
0.688
0.548
0.204
0.14
0.352
0.56
0.274924
0.370805
0.378788
0.375458
0.361607
0.417266
0.275416
0.417266
0.472313
0.227642
0.143939
0.274924
0.05
0.331169
0.487047
0.096296
0.411902
0.441799
0.512
0.382813
0.432
0.275416
40.389657
Qwen3ForCausalLM
6e8885a6ff5c1dc5201574c8fd700323f23c25fa
mit
1,040
87,716
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-05-29T13:13:34.000Z
transformers
text-generation
False
unsloth/Qwen3-8B
0.476978
0.553509
0.607186
0.9
0.59893
0.676
0.556
0.68
0.54
0.784
0.632
0.62
0.836
0.72
0.68
0.48
0.60274
0.72
0.72
0.588
0.561798
0.712
0.66
0.22
0.172
0.364
0.532
0.506798
0.369128
0.444444
0.355311
0.352679
0.482014
0.35305
0.482014
0.781759
0.487805
0.325758
0.506798
0.225
0.616883
0.683938
0.281481
0.476978
0.435185
0.524
0.300781
0.484
0.35305
47.954816
Qwen3ForCausalLM
946bc9ac74a6c1f8cf012497c503a119b2fcf2eb
apache-2.0
16
99,464
null
true
true
Qwen/Qwen3-8B-Base
2025-05-13T20:19:42.000Z
transformers
text-generation
False
Qwen/Qwen3Guard-Gen-8B
0.453374
0.530549
0.580628
0.92
0.59893
0.632
0.744
0.584
0.372
0.632
0.6
0.6
0.884
0.76
0.708
0.476
0.609589
0.684
0.712
0.488
0.516854
0.48
0.584
0.26
0.212
0.372
0.504
0.515106
0.339765
0.333333
0.347985
0.332589
0.509592
0.360444
0.509592
0.791531
0.479675
0.386364
0.515106
0.239286
0.623377
0.699482
0.22963
0.453374
0.449735
0.496
0.394531
0.46
0.360444
47.470022
Qwen3ForCausalLM
4505cb1a6f1864f21f8b27f7daf1b9a1aab6edbb
apache-2.0
103
6,844
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-11-07T08:11:03.000Z
transformers
text-generation
False
huihui-ai/Huihui-Qwen3-8B-abliterated-v2
0.472074
0.544818
0.596251
0.916
0.582888
0.668
0.548
0.664
0.532
0.748
0.644
0.62
0.836
0.74
0.688
0.484
0.589041
0.724
0.672
0.58
0.578652
0.624
0.592
0.196
0.196
0.356
0.52
0.543051
0.360738
0.419192
0.346154
0.352679
0.513189
0.360444
0.513189
0.81759
0.520325
0.409091
0.543051
0.264286
0.597403
0.720207
0.333333
0.472074
0.443122
0.508
0.292969
0.532
0.360444
48.807098
Qwen3ForCausalLM
7d89db76029281fd8f3e6698a8e30738608105a9
apache-2.0
36
9,068
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-06-18T16:15:07.000Z
transformers
text-generation
False
unsloth/Qwen3-8B-Base-unsloth-bnb-4bit
0.438913
0.522506
0.568478
0.844
0.561497
0.664
0.568
0.632
0.368
0.788
0.604
0.576
0.884
0.764
0.76
0.364
0.589041
0.632
0.656
0.552
0.679775
0.604
0.516
0.16
0.164
0.264
0.488
0.183535
0.344799
0.328283
0.35348
0.341518
0.482014
0.354898
0.482014
0.32899
0.105691
0.083333
0.183535
0.071429
0.181818
0.310881
0.074074
0.438913
0.452381
0.52
0.414063
0.424
0.354898
41.168656
Qwen3ForCausalLM
c49b94d0d5bc1c0404c71b2174560e24bd5a1e44
apache-2.0
5
7,010
null
true
false
Qwen/Qwen3-8B-Base
2025-07-14T13:02:43.000Z
transformers
text-generation
False
unsloth/DeepSeek-R1-0528-Qwen3-8B-unsloth-bnb-4bit
0.397108
0.456739
0.480125
0.764
0.55615
0.5
0.388
0.52
0.332
0.54
0.396
0.492
0.656
0.616
0.676
0.404
0.513699
0.5
0.42
0.476
0.432584
0.676
0.432
0.164
0.14
0.376
0.572
0.22281
0.361577
0.363636
0.373626
0.345982
0.398082
0.253235
0.398082
0.403909
0.211382
0.075758
0.22281
0.032143
0.214286
0.435233
0.066667
0.397108
0.428571
0.516
0.367188
0.404
0.253235
38.137875
Qwen3ForCausalLM
c5b5906bbd28e695973375f987371d71b35074a1
mit
13
9,023
8.379459
true
true
Qwen/Qwen3-8B-Base
2025-06-10T05:35:05.000Z
transformers
text-generation
False
unsloth/Qwen3-8B-Base
0.472656
0.536905
0.583406
0.896
0.620321
0.688
0.6
0.636
0.432
0.78
0.636
0.584
0.888
0.768
0.672
0.436
0.636986
0.656
0.692
0.58
0.634831
0.652
0.472
0.164
0.168
0.248
0.508
0.21148
0.365772
0.353535
0.384615
0.348214
0.522782
0.402957
0.522782
0.355049
0.154472
0.075758
0.21148
0.071429
0.220779
0.430052
0.037037
0.472656
0.452381
0.536
0.421875
0.4
0.402957
43.474614
Qwen3ForCausalLM
b5f3aaf0eaf16eb22368a19cc5b20225e619737d
apache-2.0
4
8,634
null
true
false
Qwen/Qwen3-8B-Base
2025-07-14T12:59:41.000Z
transformers
text-generation
False
Goedel-LM/Goedel-Prover-V2-8B
0.430519
0.489817
0.533761
0.86
0.59893
0.564
0.568
0.664
0.416
0.496
0.56
0.576
0.824
0.748
0.652
0.38
0.493151
0.584
0.66
0.416
0.438202
0.584
0.472
0.216
0.208
0.288
0.516
0.344411
0.312081
0.30303
0.322344
0.303571
0.352518
0.210721
0.352518
0.596091
0.365854
0.151515
0.344411
0.117857
0.344156
0.53886
0.133333
0.430519
0.435185
0.54
0.355469
0.412
0.210721
40.141245
Qwen3ForCausalLM
dfd02e6271a58375dfbf3ece0175277cf6b6a89a
apache-2.0
26
5,781
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-08-09T14:38:52.000Z
transformers
text-generation
False
Intelligent-Internet/II-Medical-8B
0.454289
0.536775
0.589134
0.864
0.614973
0.628
0.58
0.612
0.396
0.684
0.604
0.624
0.72
0.76
0.708
0.436
0.60274
0.6
0.776
0.58
0.578652
0.716
0.828
0.2
0.132
0.376
0.528
0.456949
0.348993
0.348485
0.35348
0.34375
0.41247
0.269871
0.41247
0.703583
0.463415
0.310606
0.456949
0.185714
0.474026
0.678756
0.259259
0.454289
0.433862
0.5
0.273438
0.532
0.269871
44.92828
Qwen3ForCausalLM
545fa0238261e041fb1ef3f6ed644a5a8f8400e3
apache-2.0
204
1,376
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-08-12T07:48:17.000Z
transformers
text-generation
False
Vikhrmodels/QVikhr-3-8B-Instruction
0.477809
0.555066
0.610831
0.9
0.604278
0.68
0.56
0.68
0.544
0.78
0.636
0.616
0.84
0.724
0.68
0.484
0.589041
0.716
0.724
0.6
0.578652
0.716
0.664
0.224
0.18
0.376
0.544
0.504532
0.366611
0.434343
0.349817
0.357143
0.484412
0.336414
0.484412
0.749186
0.479675
0.371212
0.504532
0.260714
0.584416
0.668394
0.281481
0.477809
0.427249
0.512
0.289063
0.484
0.336414
47.857404
Qwen3ForCausalLM
41fcbccb2804cd696858e65e9b922e5462572042
apache-2.0
9
101
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-08-06T10:27:18.000Z
transformers
text-generation
False
PrimeIntellect/Qwen3-8B
0.476978
0.553509
0.607186
0.9
0.59893
0.676
0.556
0.68
0.54
0.784
0.632
0.62
0.836
0.72
0.68
0.48
0.60274
0.72
0.72
0.588
0.561798
0.712
0.66
0.22
0.172
0.364
0.532
0.506798
0.369128
0.444444
0.355311
0.352679
0.482014
0.35305
0.482014
0.781759
0.487805
0.325758
0.506798
0.225
0.616883
0.683938
0.281481
0.476978
0.435185
0.524
0.300781
0.484
0.35305
47.954816
Qwen3ForCausalLM
e632d2027c80d3b93d91952532af6655ba3fc3f2
apache-2.0
0
571
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-09-24T16:05:31.000Z
transformers
text-generation
False
willcb/Qwen3-8B
0.476978
0.553509
0.607186
0.9
0.59893
0.676
0.556
0.68
0.54
0.784
0.632
0.62
0.836
0.72
0.68
0.48
0.60274
0.72
0.72
0.588
0.561798
0.712
0.66
0.22
0.172
0.364
0.532
0.506798
0.369128
0.444444
0.355311
0.352679
0.482014
0.35305
0.482014
0.781759
0.487805
0.325758
0.506798
0.225
0.616883
0.683938
0.281481
0.476978
0.435185
0.524
0.300781
0.484
0.35305
47.954816
Qwen3ForCausalLM
3958718b90f176e7e3315e285ea2a6bddea0abc1
null
2
3,857
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-06-06T20:46:12.000Z
transformers
text-generation
False
Goekdeniz-Guelmez/Josiefied-Qwen3-8B-abliterated-v1
0.473404
0.550396
0.60493
0.908
0.609626
0.68
0.556
0.676
0.54
0.776
0.636
0.616
0.836
0.724
0.672
0.5
0.568493
0.728
0.7
0.596
0.601124
0.668
0.656
0.204
0.184
0.352
0.516
0.522659
0.35906
0.419192
0.346154
0.348214
0.515588
0.375231
0.515588
0.801303
0.520325
0.356061
0.522659
0.239286
0.636364
0.715026
0.237037
0.473404
0.436508
0.512
0.308594
0.492
0.375231
48.535807
Qwen3ForCausalLM
2d21f18902f23a729934a9631373297e22e1ff25
null
205
1,902
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-08-11T15:17:47.000Z
transformers
text-generation
False
AI-MO/Kimina-Prover-Distill-8B
0.414977
0.476845
0.515362
0.86
0.582888
0.588
0.616
0.648
0.272
0.588
0.5
0.456
0.712
0.668
0.624
0.432
0.465753
0.452
0.64
0.452
0.505618
0.712
0.364
0.168
0.136
0.368
0.552
0.310423
0.333893
0.378788
0.336996
0.310268
0.33693
0.192237
0.33693
0.557003
0.243902
0.181818
0.310423
0.092857
0.331169
0.466321
0.140741
0.414977
0.40873
0.5
0.332031
0.396
0.192237
38.671914
Qwen3ForCausalLM
74d328a7b1f001ab4871812582fc66d9bf70c68b
apache-2.0
8
19966
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-07-10T11:32:32.000Z
transformers
text-generation
False
unsloth/DeepSeek-R1-0528-Qwen3-8B-bnb-4bit
0.400432
0.453496
0.475612
0.776
0.545455
0.5
0.432
0.528
0.348
0.532
0.396
0.48
0.636
0.608
0.68
0.4
0.486301
0.472
0.408
0.424
0.455056
0.684
0.368
0.18
0.156
0.384
0.552
0.208459
0.357383
0.328283
0.3663
0.359375
0.395683
0.255083
0.395683
0.403909
0.146341
0.068182
0.208459
0.046429
0.149351
0.404145
0.081481
0.400432
0.436508
0.532
0.355469
0.424
0.255083
37.901287
Qwen3ForCausalLM
6115346c3f9243e44d9fcf2737ff78ec59d535b2
mit
10
483
8.40837
true
true
Qwen/Qwen3-8B-Base
2025-06-10T05:36:23.000Z
transformers
text-generation
False
legmlai/legml-v1.0-8b-instruct
0.481799
0.550785
0.609269
0.904
0.609626
0.7
0.648
0.676
0.54
0.676
0.62
0.628
0.848
0.736
0.632
0.532
0.561644
0.74
0.792
0.608
0.544944
0.7
0.548
0.22
0.192
0.376
0.552
0.403323
0.358221
0.40404
0.355311
0.341518
0.630695
0.499076
0.630695
0.65798
0.349593
0.227273
0.403323
0.142857
0.493506
0.601036
0.2
0.481799
0.40873
0.496
0.308594
0.424
0.499076
48.200635
Qwen3ForCausalLM
5cc8024fe4bfb4a5ed8d6abe0655cf93f8639457
apache-2.0
2
26
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-07-30T09:43:51.000Z
transformers
text-generation
False
Intelligent-Internet/II-Medical-8B-1706
0.464511
0.509923
0.547995
0.872
0.588235
0.552
0.616
0.592
0.416
0.584
0.604
0.58
0.732
0.584
0.724
0.456
0.547945
0.624
0.612
0.524
0.55618
0.656
0.48
0.204
0.188
0.34
0.532
0.412387
0.350671
0.338384
0.338828
0.370536
0.438849
0.297597
0.438849
0.615635
0.357724
0.25
0.412387
0.178571
0.538961
0.606218
0.222222
0.464511
0.470899
0.56
0.34375
0.512
0.297597
44.755211
Qwen3ForCausalLM
a364a7cb987287fad5fefd512da3042e464d74f2
apache-2.0
138
320
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-08-12T07:47:53.000Z
transformers
text-generation
False
DavidAU/Qwen3-8B-64k-Context-2X-Josiefied-Uncensored
0.474235
0.542872
0.600937
0.92
0.59893
0.664
0.544
0.68
0.54
0.768
0.608
0.632
0.852
0.692
0.664
0.48
0.547945
0.708
0.664
0.608
0.629213
0.688
0.692
0.208
0.188
0.292
0.54
0.537009
0.342282
0.368687
0.335165
0.339286
0.495204
0.354898
0.495204
0.807818
0.487805
0.295455
0.537009
0.260714
0.701299
0.73057
0.311111
0.474235
0.416667
0.492
0.300781
0.46
0.354898
47.772236
Qwen3ForCausalLM
b2bea2419c4f85c35f382e56ff6832843aedbbf8
null
12
86
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-07-28T00:10:35.000Z
transformers
text-generation
False
unsloth/Qwen3-8B-Base-bnb-4bit
0.440076
0.522117
0.568478
0.868
0.582888
0.664
0.552
0.576
0.388
0.804
0.616
0.548
0.88
0.776
0.724
0.38
0.616438
0.62
0.66
0.528
0.696629
0.596
0.516
0.164
0.184
0.276
0.488
0.176737
0.345638
0.313131
0.351648
0.352679
0.508393
0.382625
0.508393
0.315961
0.121951
0.045455
0.176737
0.039286
0.227273
0.310881
0.074074
0.440076
0.44709
0.516
0.414063
0.412
0.382625
41.440202
Qwen3ForCausalLM
e3276524f7f4ce7df7b65dd9fcb7552a7f2d6de9
apache-2.0
3
1538
null
true
false
Qwen/Qwen3-8B-Base
2025-07-14T13:04:35.000Z
transformers
text-generation
False
ccui46/qwen3-8b-warmstart
0.475814
0.547153
0.601458
0.892
0.59893
0.664
0.552
0.696
0.544
0.736
0.62
0.616
0.856
0.716
0.664
0.456
0.60274
0.74
0.72
0.608
0.55618
0.708
0.648
0.2
0.188
0.332
0.508
0.500755
0.368289
0.429293
0.360806
0.350446
0.51199
0.36414
0.51199
0.762215
0.447154
0.310606
0.500755
0.242857
0.584416
0.725389
0.259259
0.475814
0.415344
0.512
0.277344
0.46
0.36414
47.89418
null
null
null
null
null
null
null
null
null
null
null
null
null
attn-signs/Qwen3-8b-ru
0.464096
0.535997
0.591911
0.904
0.604278
0.704
0.568
0.648
0.552
0.668
0.616
0.584
0.844
0.736
0.66
0.496
0.582192
0.7
0.688
0.612
0.58427
0.712
0.476
0.228
0.188
0.308
0.54
0.189577
0.35906
0.409091
0.355311
0.341518
0.549161
0.406654
0.549161
0.364821
0.121951
0.083333
0.189577
0.042857
0.162338
0.357513
0.051852
0.464096
0.388889
0.516
0.304688
0.348
0.406654
42.378231
Qwen3ForCausalLM
b625de3d88f05040ae15298d5d52439aac7b515a
apache-2.0
5
14
8.188548
true
true
Qwen/Qwen3-8B-Base
2025-05-06T13:19:57.000Z
transformers
text-generation
False
tomg-group-umd/DynaGuard-8B
0.460605
0.549098
0.602847
0.876
0.625668
0.588
0.592
0.628
0.476
0.784
0.62
0.6
0.844
0.768
0.64
0.476
0.630137
0.7
0.76
0.616
0.567416
0.664
0.66
0.204
0.24
0.372
0.544
0.486405
0.363255
0.383838
0.351648
0.368304
0.410072
0.260628
0.410072
0.76873
0.447154
0.363636
0.486405
0.2
0.577922
0.663212
0.237037
0.460605
0.43254
0.524
0.296875
0.48
0.260628
45.928721
Qwen3ForCausalLM
8dbd7979edfc81d5b5d97352fb685c71a0e04665
apache-2.0
15
366
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-09-03T15:05:17.000Z
transformers
text-generation
False
Kwaipilot/HiPO-8B
0.383062
0.394474
0.407395
0.816
0.614973
0.404
0.38
0.604
0.304
0.52
0.324
0.44
0.3
0.128
0.636
0.472
0.321918
0.292
0.184
0.324
0.455056
0.668
0.38
0.176
0.148
0.356
0.56
0.398036
0.323826
0.328283
0.331502
0.3125
0.382494
0.243993
0.382494
0.648208
0.365854
0.219697
0.398036
0.164286
0.506494
0.569948
0.148148
0.383062
0.407407
0.488
0.382813
0.352
0.243993
38.370326
Qwen3ForCausalLM
65de90a31bf21f7b7e1a812c674c259f3fa992f3
apache-2.0
22
14
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-11-04T02:37:23.000Z
transformers
text-generation
False
Aratako/Qwen3-8B-NSFW-JP
0.45113
0.521598
0.574032
0.92
0.572193
0.608
0.54
0.688
0.568
0.752
0.616
0.58
0.776
0.732
0.72
0.4
0.486301
0.644
0.62
0.5
0.561798
0.612
0.608
0.248
0.192
0.264
0.528
0.410876
0.345638
0.333333
0.355311
0.339286
0.467626
0.31793
0.467626
0.700326
0.349593
0.265152
0.410876
0.125
0.5
0.585492
0.192593
0.45113
0.399471
0.504
0.308594
0.388
0.31793
44.146219
Qwen3ForCausalLM
610294652e72d3a0a2873e3f481c96dbc1890015
mit
22
732
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-05-08T14:45:23.000Z
transformers
text-generation
False
haoranhe/ROVER-Qwen3-8B
0.474983
0.539759
0.583753
0.896
0.620321
0.688
0.612
0.616
0.456
0.776
0.612
0.58
0.888
0.752
0.684
0.44
0.59589
0.664
0.688
0.612
0.674157
0.66
0.416
0.168
0.18
0.268
0.504
0.302115
0.376678
0.343434
0.39011
0.375
0.549161
0.4122
0.549161
0.521173
0.292683
0.128788
0.302115
0.103571
0.324675
0.471503
0.125926
0.474983
0.46164
0.54
0.414063
0.432
0.4122
45.805496
Qwen3ForCausalLM
8cb8f9baca728e47e99d5f56f0dcf41375185b46
mit
2
6
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-01T12:57:56.000Z
transformers
text-generation
False
Trendyol/Trendyol-LLM-8B-T1
0.430851
0.532624
0.587919
0.9
0.57754
0.604
0.552
0.684
0.548
0.816
0.6
0.516
0.8
0.736
0.728
0.476
0.561644
0.684
0.664
0.612
0.595506
0.68
0.532
0.188
0.188
0.332
0.524
0.351964
0.357383
0.383838
0.357143
0.345982
0.453237
0.297597
0.453237
0.605863
0.300813
0.204545
0.351964
0.132143
0.38961
0.528497
0.125926
0.430851
0.387566
0.532
0.304688
0.328
0.297597
42.815328
Qwen3ForCausalLM
1aeda7229465d71662aacd26e6c6e8f167f683a7
apache-2.0
31
286
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-07-17T14:11:49.000Z
transformers
text-generation
False
SKN14-Final-1Team/qwen3-8b-informal-formal-merged-09-19
0.467836
0.542223
0.595903
0.892
0.631016
0.644
0.632
0.692
0.588
0.596
0.636
0.604
0.832
0.764
0.668
0.468
0.554795
0.716
0.744
0.604
0.533708
0.66
0.536
0.224
0.156
0.388
0.512
0.528701
0.348993
0.358586
0.346154
0.348214
0.515588
0.377079
0.515588
0.765472
0.593496
0.356061
0.528701
0.257143
0.649351
0.720207
0.251852
0.467836
0.437831
0.508
0.289063
0.52
0.377079
48.247528
Qwen3ForCausalLM
870bb666b9c1645e149538d4dfbc2e382874d09e
null
1
1
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-09-19T03:45:08.000Z
transformers
text-generation
False
rahim-xelpmoc/Qwen3-8B-bnb
0.463098
0.544947
0.600243
0.908
0.614973
0.66
0.556
0.652
0.54
0.84
0.636
0.62
0.864
0.744
0.656
0.428
0.541096
0.668
0.592
0.556
0.522472
0.72
0.764
0.224
0.176
0.36
0.52
0.488671
0.363255
0.39899
0.346154
0.368304
0.478417
0.31793
0.478417
0.778502
0.479675
0.318182
0.488671
0.203571
0.571429
0.709845
0.185185
0.463098
0.410053
0.484
0.304688
0.444
0.31793
46.728955
Qwen3ForCausalLM
91a5b740e4c5c6a2149a90b45c1b24e8d3a8cf3d
null
0
17
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-11T04:54:40.000Z
transformers
text-generation
False
faresfawzi/Qwen3-8B-SCRIBE
0.469997
0.530549
0.581149
0.88
0.59893
0.66
0.54
0.66
0.548
0.576
0.632
0.58
0.804
0.72
0.684
0.42
0.582192
0.632
0.652
0.608
0.522472
0.724
0.664
0.172
0.16
0.372
0.544
0.522659
0.359899
0.393939
0.35348
0.352679
0.535971
0.391867
0.535971
0.801303
0.495935
0.340909
0.522659
0.225
0.662338
0.699482
0.296296
0.469997
0.414021
0.504
0.265625
0.476
0.391867
48.061602
Qwen3ForCausalLM
00989133afe8260aa127f1d2d9ace96f69635421
apache-2.0
0
1
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-28T15:53:43.000Z
transformers
text-generation
False
SciReason/SciReasoner-8B
0.295047
0.431055
0.461552
0.708
0.540107
0.552
0.436
0.532
0.188
0.564
0.416
0.388
0.62
0.744
0.592
0.376
0.458904
0.512
0.496
0.328
0.561798
0.632
0.376
0.168
0.124
0.288
0.524
0.018882
0.301174
0.277778
0.305861
0.305804
0.223022
0.136784
0.223022
0.022801
0
0.007576
0.018882
0.010714
0.019481
0.051813
0.007407
0.295047
0.403439
0.516
0.363281
0.332
0.136784
28.385263
Qwen3ForCausalLM
772c4adaf43c750db5ef04d6f567148ca3daf7b0
apache-2.0
8
26
8.190735
true
false
Qwen/Qwen3-8B-Base
2025-09-28T15:24:21.000Z
transformers
text-generation
False
korazer/qwen3-8b-crypto-bot-dpo-merged
0.476729
0.555066
0.609964
0.9
0.588235
0.668
0.56
0.684
0.54
0.764
0.636
0.632
0.848
0.72
0.672
0.48
0.582192
0.724
0.72
0.612
0.573034
0.716
0.664
0.216
0.184
0.4
0.528
0.506042
0.374161
0.429293
0.357143
0.370536
0.454436
0.301294
0.454436
0.762215
0.479675
0.333333
0.506042
0.192857
0.623377
0.751295
0.281481
0.476729
0.421958
0.5
0.292969
0.476
0.301294
47.388163
Qwen3ForCausalLM
fd92c9649542fb4652321bb93fb578d717ca59ca
null
0
0
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-09-25T08:12:22.000Z
transformers
text-generation
False
NaruseShiroha/capybara-instruct-8b
0.452294
0.533921
0.587224
0.8
0.566845
0.596
0.556
0.656
0.568
0.684
0.632
0.58
0.832
0.74
0.7
0.512
0.534247
0.68
0.68
0.588
0.578652
0.652
0.652
0.248
0.2
0.34
0.488
0.135196
0.345638
0.368687
0.335165
0.348214
0.252998
0.131238
0.252998
0.276873
0.01626
0
0.135196
0.05
0.214286
0.207254
0.037037
0.452294
0.424603
0.5
0.320313
0.456
0.131238
36.632551
Qwen3ForCausalLM
d6dacb701b79dfcf587739dc780cf9ccff3760b2
apache-2.0
0
0
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-17T18:35:49.000Z
transformers
text-generation
False
codys12/Qwen3-8B-BitNet
0.1186
0.311195
0.316612
0.468
0.486631
0.248
0.42
0.536
0.104
0.484
0.152
0.164
0.348
0.208
0.58
0.252
0.260274
0.164
0.28
0.148
0.466292
0.536
0.168
0.204
0.116
0.32
0.548
0.03852
0.255034
0.282828
0.258242
0.238839
0.177458
0.099815
0.177458
0.078176
0.02439
0.007576
0.03852
0.007143
0.038961
0.072539
0.007407
0.1186
0.358466
0.532
0.265625
0.28
0.099815
21.078149
Qwen3ForCausalLM
ba16b4ac59b7358353ed556b1ddda193ae21ce80
apache-2.0
16
33
null
true
true
Qwen/Qwen3-8B-Base
2025-07-07T18:26:14.000Z
transformers
text-generation
False
rl-rag/qwen3-8B-sft-mix-v20250915
0.455286
0.523933
0.571081
0.892
0.55615
0.676
0.572
0.648
0.38
0.688
0.604
0.568
0.752
0.672
0.684
0.436
0.541096
0.672
0.752
0.544
0.617978
0.608
0.592
0.216
0.176
0.34
0.516
0.442598
0.370805
0.393939
0.360806
0.372768
0.417266
0.269871
0.417266
0.70684
0.398374
0.30303
0.442598
0.203571
0.512987
0.580311
0.237037
0.455286
0.406085
0.492
0.28125
0.448
0.269871
44.385362
Qwen3ForCausalLM
1b2c4ef0908c90c8b5201178b4007f2794d79e62
other
0
0
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-09-16T21:57:56.000Z
transformers
text-generation
False
inclusionAI/AReaL-boba-2-8B
0.475233
0.556492
0.611699
0.896
0.604278
0.66
0.552
0.68
0.528
0.772
0.644
0.632
0.836
0.724
0.676
0.492
0.59589
0.72
0.752
0.6
0.567416
0.696
0.772
0.204
0.188
0.356
0.512
0.511329
0.370805
0.434343
0.35348
0.363839
0.486811
0.340111
0.486811
0.801303
0.495935
0.340909
0.511329
0.210714
0.603896
0.720207
0.251852
0.475233
0.428571
0.5
0.296875
0.492
0.340111
48.074145
Qwen3ForCausalLM
88ba93ee3a784c91cc1c2f1377f150393bd381d3
apache-2.0
25
6
null
true
true
Qwen/Qwen3-8B-Base
2025-06-13T12:58:00.000Z
transformers
text-generation
False
s3nh/EduHelp-8B
0.457281
0.545985
0.601979
0.88
0.550802
0.66
0.648
0.6
0.54
0.812
0.608
0.584
0.832
0.704
0.68
0.48
0.582192
0.688
0.776
0.592
0.601124
0.668
0.632
0.224
0.196
0.356
0.532
0.202417
0.355705
0.393939
0.351648
0.34375
0.490408
0.338262
0.490408
0.364821
0.146341
0.083333
0.202417
0.057143
0.175325
0.398964
0.051852
0.457281
0.419312
0.492
0.320313
0.448
0.338262
42.118348
Qwen3ForCausalLM
0b1c2b0f1c6ecacd2e931353bc4b4099b88ee544
mit
5
2
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-14T14:01:01.000Z
transformers
text-generation
False
SubconsciousDev/TIM-8b-preview
0.485372
0.558049
0.617601
0.908
0.604278
0.688
0.64
0.68
0.544
0.768
0.636
0.616
0.824
0.708
0.684
0.492
0.568493
0.748
0.756
0.6
0.561798
0.7
0.688
0.236
0.2
0.4
0.532
0.509819
0.355705
0.368687
0.349817
0.357143
0.509592
0.373383
0.509592
0.785016
0.471545
0.378788
0.509819
0.235714
0.61039
0.663212
0.281481
0.485372
0.42328
0.508
0.324219
0.44
0.373383
48.35616
Qwen3ForCausalLM
b54d6dd0d510cb0454145eeddbc293adcb45e6bd
mit
1
221
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-07-23T15:12:11.000Z
transformers
text-generation
False
caiyuchen/DAPO-step-1
0.472656
0.536905
0.583406
0.896
0.620321
0.688
0.6
0.636
0.432
0.78
0.636
0.584
0.888
0.768
0.672
0.436
0.636986
0.656
0.692
0.58
0.634831
0.652
0.472
0.164
0.168
0.248
0.508
0.21148
0.365772
0.353535
0.384615
0.348214
0.523981
0.404806
0.523981
0.355049
0.154472
0.075758
0.21148
0.071429
0.220779
0.430052
0.037037
0.472656
0.452381
0.536
0.421875
0.4
0.404806
43.494598
Qwen3ForCausalLM
47a31997f54912df75c63483c75c85bf9273c417
apache-2.0
0
4
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-03T12:42:04.000Z
transformers
text-generation
False
caiyuchen/DAPO-step-6
0.471576
0.537424
0.582885
0.896
0.625668
0.684
0.604
0.628
0.448
0.772
0.62
0.592
0.892
0.76
0.656
0.432
0.630137
0.652
0.7
0.58
0.657303
0.652
0.476
0.16
0.18
0.244
0.5
0.228852
0.369128
0.348485
0.397436
0.34375
0.5
0.390018
0.5
0.413681
0.219512
0.113636
0.228852
0.064286
0.207792
0.373057
0.088889
0.471576
0.456349
0.532
0.433594
0.404
0.390018
43.479823
Qwen3ForCausalLM
f4e428ba026bda44572bbd89e9bec4dc1c78e9c6
apache-2.0
0
8
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-03T12:42:13.000Z
transformers
text-generation
False
fiyoung/Qwen3-8B-Base
0.476313
0.531197
0.576115
0.888
0.572193
0.648
0.604
0.632
0.476
0.784
0.588
0.564
0.884
0.748
0.7
0.444
0.568493
0.668
0.692
0.58
0.640449
0.648
0.416
0.164
0.184
0.256
0.492
0.234139
0.364094
0.378788
0.342491
0.383929
0.444844
0.340111
0.444844
0.387622
0.170732
0.098485
0.234139
0.092857
0.246753
0.419689
0.088889
0.476313
0.452381
0.52
0.425781
0.412
0.340111
42.464774
Qwen3ForCausalLM
3377874d884a5a2a6f1255525b75b19eebaab5fd
apache-2.0
0
2
8.190735
true
false
Qwen/Qwen3-8B-Base
2025-10-19T10:08:52.000Z
transformers
text-generation
False
YeQingWen/8B-3000data
0.478723
0.534959
0.590522
0.88
0.561497
0.628
0.552
0.696
0.496
0.736
0.608
0.608
0.824
0.74
0.644
0.444
0.59589
0.704
0.772
0.564
0.55618
0.696
0.62
0.228
0.168
0.32
0.516
0.469789
0.347315
0.353535
0.347985
0.34375
0.461631
0.301294
0.461631
0.716612
0.447154
0.333333
0.469789
0.2
0.564935
0.668394
0.22963
0.478723
0.407407
0.504
0.304688
0.416
0.301294
45.923132
Qwen3ForCausalLM
673315c6985109009c451e5d158366950b5bd03e
other
0
5
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-06T09:28:35.000Z
transformers
text-generation
False
caiyuchen/DAPO-step-10
0.475482
0.540278
0.585141
0.888
0.614973
0.696
0.592
0.616
0.452
0.772
0.632
0.596
0.88
0.76
0.664
0.436
0.657534
0.64
0.704
0.584
0.657303
0.66
0.492
0.172
0.18
0.256
0.5
0.226586
0.377517
0.353535
0.410256
0.348214
0.538369
0.425139
0.538369
0.413681
0.170732
0.113636
0.226586
0.060714
0.207792
0.409326
0.066667
0.475482
0.455026
0.528
0.429688
0.408
0.425139
44.302036
Qwen3ForCausalLM
f3b40be82f6169ab38cc441ab431a7d57cbfd9f1
apache-2.0
0
1
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-03T12:42:21.000Z
transformers
text-generation
False
caiyuchen/DAPO-step-2
0.471991
0.535737
0.580976
0.892
0.620321
0.688
0.6
0.616
0.436
0.76
0.612
0.584
0.888
0.764
0.66
0.424
0.643836
0.66
0.708
0.584
0.651685
0.66
0.468
0.152
0.18
0.244
0.504
0.209215
0.370805
0.343434
0.404762
0.341518
0.514388
0.38817
0.514388
0.364821
0.162602
0.106061
0.209215
0.057143
0.214286
0.38342
0.059259
0.471991
0.451058
0.532
0.425781
0.396
0.38817
43.307224
Qwen3ForCausalLM
9a15f5d76e5be094116f10fa265a818c18e8c344
apache-2.0
0
4
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-03T12:42:06.000Z
transformers
text-generation
False
ForSureTesterSim/Deepseek-NSFW-Qwen3
0.395279
0.484758
0.527513
0.868
0.57754
0.544
0.448
0.616
0.464
0.6
0.548
0.544
0.812
0.668
0.664
0.372
0.452055
0.596
0.336
0.452
0.477528
0.724
0.568
0.284
0.176
0.348
0.488
0.323263
0.325503
0.388889
0.302198
0.325893
0.429257
0.28281
0.429257
0.592834
0.170732
0.189394
0.323263
0.114286
0.337662
0.492228
0.155556
0.395279
0.410053
0.496
0.320313
0.416
0.28281
40.181126
Qwen3ForCausalLM
16d820e96e13c36f926eff15caddde75a6431220
null
3
41
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-09-18T07:48:16.000Z
transformers
text-generation
False
caiyuchen/DAPO-step-7
0.472656
0.535478
0.581149
0.892
0.614973
0.696
0.588
0.636
0.448
0.772
0.632
0.588
0.876
0.748
0.656
0.432
0.643836
0.644
0.708
0.58
0.646067
0.652
0.464
0.156
0.176
0.26
0.492
0.222054
0.371644
0.368687
0.401099
0.337054
0.514388
0.393715
0.514388
0.381107
0.219512
0.083333
0.222054
0.060714
0.233766
0.393782
0.074074
0.472656
0.445767
0.528
0.417969
0.392
0.393715
43.460995
Qwen3ForCausalLM
b9f39aa8a022a6e3054585133c215048ffbca5de
apache-2.0
0
2
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-03T12:42:15.000Z
transformers
text-generation
False
kazuyamaa/Qwen3-8B-Math-GRPO
0.466672
0.547542
0.60493
0.912
0.609626
0.676
0.624
0.664
0.54
0.636
0.62
0.616
0.844
0.76
0.688
0.464
0.616438
0.716
0.788
0.616
0.634831
0.716
0.452
0.24
0.204
0.38
0.516
0.519637
0.359899
0.383838
0.364469
0.34375
0.377698
0.269871
0.377698
0.785016
0.577236
0.356061
0.519637
0.2
0.668831
0.720207
0.22963
0.466672
0.406085
0.52
0.285156
0.416
0.269871
45.58202
Qwen3ForCausalLM
cfd72520d1751ef5fe99de1a36ec10a093213933
apache-2.0
0
24
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-19T08:37:38.000Z
transformers
text-generation
False
caiyuchen/DAPO-step-9
0.472989
0.535608
0.580628
0.896
0.625668
0.692
0.592
0.62
0.448
0.752
0.624
0.576
0.88
0.752
0.66
0.42
0.630137
0.656
0.704
0.572
0.634831
0.656
0.488
0.172
0.18
0.26
0.492
0.215257
0.370805
0.343434
0.393773
0.354911
0.538369
0.4122
0.538369
0.364821
0.211382
0.075758
0.215257
0.075
0.207792
0.388601
0.066667
0.472989
0.452381
0.528
0.425781
0.404
0.4122
43.840491
Qwen3ForCausalLM
dad6ed6175cea3b2f2f9367a087e69a0c919a982
apache-2.0
0
2
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-03T12:42:19.000Z
transformers
text-generation
False
caiyuchen/DAPO-step-4
0.47249
0.537554
0.583579
0.892
0.614973
0.684
0.596
0.644
0.444
0.788
0.616
0.588
0.884
0.756
0.668
0.432
0.636986
0.652
0.708
0.588
0.651685
0.656
0.472
0.16
0.176
0.252
0.496
0.23565
0.372483
0.343434
0.406593
0.34375
0.510791
0.38817
0.510791
0.442997
0.170732
0.090909
0.23565
0.057143
0.188312
0.42487
0.118519
0.47249
0.44709
0.536
0.421875
0.384
0.38817
43.701389
Qwen3ForCausalLM
05948813e0fddb033cc7c05170d6cb8d58d31166
apache-2.0
0
1
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-03T12:42:09.000Z
transformers
text-generation
False
caiyuchen/DAPO-step-3
0.472906
0.538591
0.585836
0.896
0.625668
0.684
0.592
0.632
0.432
0.784
0.628
0.6
0.884
0.756
0.668
0.444
0.650685
0.656
0.7
0.584
0.634831
0.664
0.476
0.168
0.184
0.264
0.504
0.206193
0.366611
0.343434
0.39011
0.348214
0.496403
0.371534
0.496403
0.37785
0.138211
0.068182
0.206193
0.046429
0.24026
0.357513
0.088889
0.472906
0.449735
0.532
0.425781
0.392
0.371534
42.961397
Qwen3ForCausalLM
81639126ceb5d00922ded48ac9d17be16d9e8d44
apache-2.0
0
4
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-03T12:42:08.000Z
transformers
text-generation
False
caiyuchen/DAPO-step-8
0.472158
0.536386
0.582885
0.896
0.609626
0.684
0.596
0.616
0.44
0.768
0.624
0.588
0.884
0.76
0.656
0.432
0.636986
0.656
0.7
0.596
0.657303
0.652
0.492
0.172
0.172
0.256
0.496
0.219789
0.368289
0.353535
0.399267
0.337054
0.522782
0.415896
0.522782
0.381107
0.195122
0.090909
0.219789
0.067857
0.175325
0.38342
0.133333
0.472158
0.44709
0.532
0.421875
0.388
0.415896
43.549855
Qwen3ForCausalLM
335786b97fa76f749f2616f18d89050f484e7d42
apache-2.0
0
0
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-03T12:42:17.000Z
transformers
text-generation
False
caiyuchen/DAPO-step-5
0.471493
0.537164
0.582191
0.896
0.631016
0.692
0.596
0.632
0.444
0.776
0.616
0.576
0.884
0.752
0.66
0.44
0.623288
0.648
0.712
0.58
0.646067
0.648
0.476
0.172
0.156
0.26
0.504
0.219789
0.371644
0.358586
0.399267
0.34375
0.509592
0.386322
0.509592
0.400651
0.154472
0.136364
0.219789
0.05
0.201299
0.388601
0.081481
0.471493
0.455026
0.532
0.425781
0.408
0.386322
43.495581
Qwen3ForCausalLM
f6e135723b73901fffbfa71f9643612035ad070f
apache-2.0
0
2
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-03T12:42:11.000Z
transformers
text-generation
False
caiyuchen/DAPO-step-0
0.472656
0.536905
0.583406
0.896
0.620321
0.688
0.6
0.636
0.432
0.78
0.636
0.584
0.888
0.768
0.672
0.436
0.636986
0.656
0.692
0.58
0.634831
0.652
0.472
0.164
0.168
0.248
0.508
0.21148
0.365772
0.353535
0.384615
0.348214
0.523981
0.404806
0.523981
0.355049
0.154472
0.075758
0.21148
0.071429
0.220779
0.430052
0.037037
0.472656
0.452381
0.536
0.421875
0.4
0.404806
43.494598
Qwen3ForCausalLM
85f4876c54c939f085fcc7fee383fc7c6b379490
apache-2.0
0
3
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-03T12:42:02.000Z
transformers
text-generation
False
dnotitia/Smoothie-Qwen3-8B
0.476978
0.553249
0.607186
0.9
0.59893
0.676
0.556
0.68
0.54
0.784
0.632
0.62
0.836
0.72
0.68
0.48
0.60274
0.72
0.72
0.588
0.561798
0.712
0.66
0.22
0.172
0.364
0.532
0.504532
0.368289
0.444444
0.35348
0.352679
0.491607
0.336414
0.491607
0.781759
0.471545
0.318182
0.504532
0.228571
0.577922
0.704663
0.288889
0.476978
0.433862
0.524
0.300781
0.48
0.336414
48.040896
Qwen3ForCausalLM
a3d318d4fd455c8b213e8978b6b9cd50717ab6b0
apache-2.0
13
8
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-05-04T15:04:34.000Z
transformers
text-generation
False
telecomadm1145/Qwen3-8B-Novel-AdaLoRA
0.476978
0.551952
0.608401
0.908
0.59893
0.684
0.596
0.676
0.552
0.788
0.616
0.62
0.84
0.74
0.672
0.484
0.609589
0.724
0.732
0.596
0.589888
0.692
0.588
0.228
0.176
0.36
0.524
0.484894
0.362416
0.409091
0.355311
0.350446
0.484412
0.332717
0.484412
0.762215
0.487805
0.280303
0.484894
0.185714
0.623377
0.694301
0.214815
0.476978
0.420635
0.504
0.292969
0.468
0.332717
47.295619
null
c1dd0d05ee01fb09def7a6f390f6da622d6c151f
null
0
0
null
true
true
Qwen/Qwen3-8B-Base
2025-10-03T03:08:34.000Z
peft
text-generation
False
caiyuchen/DAPO-step-14
0.476147
0.536645
0.582711
0.892
0.614973
0.676
0.588
0.604
0.46
0.788
0.624
0.596
0.872
0.756
0.66
0.42
0.630137
0.664
0.708
0.58
0.651685
0.652
0.496
0.16
0.188
0.256
0.496
0.23716
0.366611
0.328283
0.391941
0.352679
0.52518
0.417745
0.52518
0.446254
0.227642
0.075758
0.23716
0.067857
0.214286
0.388601
0.088889
0.476147
0.453704
0.532
0.421875
0.408
0.417745
44.025212
Qwen3ForCausalLM
2784c323f55c9edd89383e2b36be1373bd418bf1
apache-2.0
0
0
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-03T12:42:29.000Z
transformers
text-generation
False
allstax/editorial-qwe3-8b-v2
0.476978
0.553509
0.607186
0.9
0.59893
0.676
0.556
0.68
0.54
0.784
0.632
0.62
0.836
0.72
0.68
0.48
0.60274
0.72
0.72
0.588
0.561798
0.712
0.66
0.22
0.172
0.364
0.532
0.506798
0.369128
0.444444
0.355311
0.352679
0.482014
0.35305
0.482014
0.781759
0.487805
0.325758
0.506798
0.225
0.616883
0.683938
0.281481
0.476978
0.435185
0.524
0.300781
0.484
0.35305
47.954816
Qwen3ForCausalLM
c33983431344f37b213f1e51f320a5dc1b8479b1
apache-2.0
0
0
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-08-31T19:40:02.000Z
transformers
text-generation
False
maidacundo/annie-lite-v0.4.7-grpo-540-qwen3-8b
0.461021
0.538202
0.598681
0.916
0.582888
0.668
0.612
0.616
0.496
0.812
0.624
0.6
0.808
0.696
0.656
0.448
0.609589
0.652
0.796
0.64
0.544944
0.72
0.564
0.2
0.208
0.352
0.532
0.492447
0.350671
0.414141
0.327839
0.350446
0.545564
0.384473
0.545564
0.781759
0.479675
0.30303
0.492447
0.221429
0.545455
0.683938
0.259259
0.461021
0.373016
0.5
0.253906
0.368
0.384473
47.023318
Qwen3ForCausalLM
580815b02bbd63bb317392f56fa707c2f1c7d502
apache-2.0
0
0
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-18T15:38:46.000Z
transformers
text-generation
False
miromind-ai/MiroThinker-8B-SFT-v0.1
0.469498
0.537424
0.591217
0.912
0.593583
0.668
0.496
0.68
0.416
0.748
0.62
0.596
0.832
0.728
0.712
0.444
0.59589
0.708
0.684
0.536
0.55618
0.728
0.656
0.24
0.216
0.26
0.556
0.483384
0.35906
0.40404
0.357143
0.341518
0.513189
0.362292
0.513189
0.765472
0.463415
0.30303
0.483384
0.217857
0.558442
0.673575
0.22963
0.469498
0.40873
0.492
0.335938
0.4
0.362292
47.084642
null
null
null
null
null
null
null
null
null
null
null
null
null
DCAgent/freelancer-projects-0-1k-traces
0.473487
0.54832
0.60302
0.896
0.588235
0.68
0.548
0.676
0.572
0.772
0.636
0.612
0.868
0.72
0.676
0.48
0.561644
0.704
0.756
0.608
0.58427
0.692
0.548
0.232
0.172
0.352
0.512
0.503776
0.368289
0.409091
0.351648
0.370536
0.471223
0.325323
0.471223
0.781759
0.455285
0.30303
0.503776
0.253571
0.616883
0.678756
0.251852
0.473487
0.415344
0.504
0.289063
0.456
0.325323
47.252327
Qwen3ForCausalLM
9cdf7fe5b46811835dfc75e6de335b1b9beea9f0
apache-2.0
0
8
0.000308
true
true
Qwen/Qwen3-8B-Base
2025-11-11T14:17:18.000Z
transformers
text-generation
False
soo1206/title-8b-3054
0.482962
0.54378
0.597292
0.904
0.57754
0.68
0.528
0.676
0.52
0.728
0.62
0.604
0.844
0.732
0.688
0.468
0.589041
0.732
0.76
0.604
0.55618
0.656
0.616
0.204
0.18
0.316
0.532
0.494713
0.360738
0.393939
0.358974
0.348214
0.502398
0.365989
0.502398
0.76873
0.504065
0.333333
0.494713
0.217857
0.616883
0.658031
0.222222
0.482962
0.424603
0.512
0.304688
0.46
0.365989
47.711779
Qwen3ForCausalLM
9640c4453b20147f3dbcbbe74c50e3310c755cd1
null
0
0
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-09-26T12:37:50.000Z
transformers
text-generation
False
yueqis/agenttuning-qwen3-8b-16k-1e-5
0.475648
0.552212
0.605971
0.908
0.604278
0.692
0.572
0.688
0.556
0.772
0.644
0.62
0.848
0.724
0.676
0.476
0.575342
0.704
0.764
0.608
0.544944
0.692
0.54
0.224
0.192
0.364
0.524
0.501511
0.379195
0.429293
0.371795
0.366071
0.492806
0.340111
0.492806
0.775244
0.463415
0.348485
0.501511
0.214286
0.61039
0.663212
0.303704
0.475648
0.415344
0.492
0.277344
0.48
0.340111
47.841239
Qwen3ForCausalLM
d97051189d79501247f9c5c4f010bc5a7ec5877e
other
0
0
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-09-24T15:22:24.000Z
transformers
text-generation
False
inclusionAI/AReaL-boba-2-8B-Open
0.479804
0.557141
0.613435
0.896
0.614973
0.66
0.572
0.688
0.544
0.768
0.624
0.612
0.84
0.732
0.664
0.488
0.609589
0.744
0.74
0.596
0.578652
0.712
0.74
0.208
0.184
0.372
0.524
0.516616
0.366611
0.414141
0.357143
0.357143
0.479616
0.334566
0.479616
0.778502
0.495935
0.30303
0.516616
0.235714
0.694805
0.699482
0.266667
0.479804
0.428571
0.516
0.292969
0.48
0.334566
48.077564
Qwen3ForCausalLM
0a2d6b38229e58fac8213d3ce24789f796693858
apache-2.0
19
8
null
true
true
Qwen/Qwen3-8B-Base
2025-06-04T16:44:09.000Z
transformers
text-generation
False
maidacundo/annie-lite-v0.4.2-sft-qwen3-8b
0.456449
0.533013
0.592258
0.908
0.566845
0.664
0.6
0.64
0.516
0.756
0.608
0.6
0.812
0.672
0.656
0.464
0.60274
0.676
0.792
0.592
0.578652
0.728
0.54
0.192
0.168
0.348
0.528
0.470544
0.35151
0.40404
0.347985
0.332589
0.532374
0.365989
0.532374
0.745928
0.414634
0.318182
0.470544
0.210714
0.512987
0.65285
0.274074
0.456449
0.367725
0.504
0.253906
0.348
0.365989
46.18101
Qwen3ForCausalLM
8bc2694cb3fb313626b46e0092b1236152a21899
apache-2.0
0
0
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-09T06:49:36.000Z
transformers
text-generation
False
caiyuchen/DAPO-step-16
0.475898
0.536645
0.582364
0.884
0.620321
0.684
0.588
0.612
0.452
0.764
0.628
0.572
0.892
0.764
0.656
0.436
0.630137
0.664
0.712
0.58
0.651685
0.66
0.488
0.156
0.176
0.26
0.496
0.23716
0.36745
0.363636
0.388278
0.34375
0.535971
0.408503
0.535971
0.429967
0.203252
0.128788
0.23716
0.057143
0.227273
0.388601
0.103704
0.475898
0.455026
0.532
0.421875
0.412
0.408503
44.231154
Qwen3ForCausalLM
9be4c68b43bd954fda28af120654a43e511ca499
apache-2.0
0
2
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-03T12:42:33.000Z
transformers
text-generation
False
caiyuchen/DAPO-step-11
0.473487
0.534959
0.581323
0.896
0.604278
0.684
0.596
0.612
0.468
0.748
0.62
0.576
0.88
0.756
0.66
0.424
0.636986
0.66
0.712
0.584
0.646067
0.656
0.48
0.156
0.172
0.268
0.504
0.216012
0.364933
0.343434
0.397436
0.334821
0.551559
0.439926
0.551559
0.410423
0.170732
0.037879
0.216012
0.067857
0.24026
0.352332
0.074074
0.473487
0.449735
0.528
0.421875
0.4
0.439926
43.95082
Qwen3ForCausalLM
8aa0b101faaaa296fa7ca562ea26fdf9fa2d9cb8
apache-2.0
0
2
8.190735
true
true
Qwen/Qwen3-8B-Base
2025-10-03T12:42:23.000Z
transformers
text-generation
False