id stringlengths 2 115 | author stringlengths 2 42 ⌀ | last_modified timestamp[us, tz=UTC] | downloads int64 0 8.87M | likes int64 0 3.84k | paperswithcode_id stringlengths 2 45 ⌀ | tags list | lastModified timestamp[us, tz=UTC] | createdAt stringlengths 24 24 | key stringclasses 1 value | created timestamp[us] | card stringlengths 1 1.01M | embedding list | library_name stringclasses 21 values | pipeline_tag stringclasses 27 values | mask_token null | card_data null | widget_data null | model_index null | config null | transformers_info null | spaces null | safetensors null | transformersInfo null | modelId stringlengths 5 111 ⌀ | embeddings list |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Gus1993ever/Khrome1 | Gus1993ever | 2023-11-25T17:33:08Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T17:33:08Z | 2023-11-25T17:31:05.000Z | 2023-11-25T17:31:05 | ---
license: openrail
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Gabriel1322/jotac | Gabriel1322 | 2023-11-25T17:34:33Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T17:34:33Z | 2023-11-25T17:33:44.000Z | 2023-11-25T17:33:44 | ---
license: openrail
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
stanmalkinson199/2Dphasetwo | stanmalkinson199 | 2023-11-25T17:36:22Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T17:36:22Z | 2023-11-25T17:35:45.000Z | 2023-11-25T17:35:45 | ---
license: openrail
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
drscotthawley/SignalTrain-LA2A | drscotthawley | 2023-11-25T17:59:40Z | 0 | 0 | null | [
"license:cc-by-4.0",
"arxiv:1905.11928",
"arxiv:2102.06200",
"arxiv:2006.05584",
"region:us"
] | 2023-11-25T17:59:40Z | 2023-11-25T17:48:25.000Z | 2023-11-25T17:48:25 | ---
license: cc-by-4.0
---
# SignalTrain LA2A Dataset (v.1.1)
> Downloadable from https://zenodo.org/records/3824876
20 GB of audio in & audio out for an LA-2A compressor unit, conditioned on knob variations.
LA-2A Compressor data to accompany the paper "SignalTrain: Profiling Audio Compressors with Deep Neural Networks," 147th Audio Engineering Society Convention (AES), 2019. https://arxiv.org/abs/1905.11928
Accompanying computer code: https://github.com/drscotthawley/signaltrain
A collection of recorded data from an analog Teletronix LA-2A opto-electronic compressor, for various settings of the Peak Reduction knob. Other knobs were kept constant.
Audio samples in these files are either randomly generated, downloaded audio clips with Creative Commons licenses, or the property of Scott Hawley, freely distributed as part of this dataset.
Data taken by Ben Colburn, supervised by Scott Hawley
## Revisions in v.1.1 of the dataset:
Made the following corrections to discrepancies in the original dataset:
Only one file of the input/target pair was present: 235, 236
$ rm Train/target_235_LA2A_2c__0__70.wav
$ rm Val/input_236_.wav
In the wrong directory: 245
$ mv Train/input_245_.wav Val/
Mismatched length and time alignment: 148, 149, 150, 152
All had targets delayed by 8583 samples relative to their inputs, and the targets were shorter.
Truncated the beginning of each input to match its target. Used the new script check_dataset.py to fix and overwrite the earlier versions (a sketch of this fix follows the command below):
$ signaltrain/utils/check_dataset.py --fix SignalTrain_LA2A_Dataset/
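For illustration, a minimal sketch of what that alignment fix does (assuming the soundfile package; the filenames here are hypothetical, and the canonical implementation is signaltrain/utils/check_dataset.py):
```
import soundfile as sf

DELAY = 8583  # samples by which targets lagged their inputs (see above)

x, sr = sf.read("Train/input_148_.wav")                 # hypothetical pair
y, _ = sf.read("Train/target_148_LA2A_2c__0__50.wav")   # hypothetical filename
x = x[DELAY:DELAY + len(y)]  # drop the input's leading samples so it matches the target
sf.write("Train/input_148_.wav", x, sr)
```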
## Papers this dataset was used in:
"Efficient neural networks for real-time analog audio effect modeling" by C. Steinmetz & J. Reiss, 2021. https://arxiv.org/abs/2102.06200
"Exploring quality and generalizability in parameterized neural audio effects," by W. Mitchell and S. H. Hawley, 149th Audio Engineering Society Convention (AES), 2020. https://arxiv.org/abs/2006.05584
"SignalTrain: Profiling Audio Compressors with Deep Neural Networks," 147th Audio Engineering Society Convention (AES), 2019. https://arxiv.org/abs/1905.11928 | [
-0.42003339529037476,
-0.32123786211013794,
0.22593984007835388,
0.3212929666042328,
0.09005412459373474,
-0.11969868838787079,
-0.2881958782672882,
-0.46579477190971375,
0.24683278799057007,
0.26798173785209656,
-0.829846203327179,
-0.015790283679962158,
-0.5199086666107178,
-0.2605529427... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Ichsan2895/DPO_ID-Wiki_10kTesting | Ichsan2895 | 2023-11-25T18:19:29Z | 0 | 0 | null | [
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2023-11-25T18:19:29Z | 2023-11-25T17:55:48.000Z | 2023-11-25T17:55:48 | ---
license: cc-by-nc-sa-4.0
---
## HOW TO WRANGLE THIS DATASET INTO DPO & CHATML FORMAT
```
from datasets import load_dataset

def return_prompt_and_responses(samples) -> dict[str, list[str]]:
    return {
        "prompt": [
            "<|im_start|>user\n" + i + "<|im_end|>\n"
            for i in samples["PROMPT"]
        ],
        "chosen": [
            "<|im_start|>assistant\n" + j + "<|im_end|>"
            for j in samples["CHOSEN"]
        ],
        "rejected": [
            "<|im_start|>assistant\n" + k + "<|im_end|>"
            for k in samples["REJECTED"]
        ],
    }

dataset = load_dataset(
    "Ichsan2895/DPO_ID-Wiki_10kTesting",
    split="train",
)
original_columns = dataset.column_names
dataset = dataset.map(
    return_prompt_and_responses,
    batched=True,
    remove_columns=original_columns,
)
```
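As a quick sanity check, you can inspect one wrangled example (a minimal sketch; the field names follow the mapping above):
```
print(dataset[0]["prompt"])    # "<|im_start|>user\n...<|im_end|>\n"
print(dataset[0]["chosen"])    # "<|im_start|>assistant\n...<|im_end|>"
print(dataset[0]["rejected"])
```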
## HOW TO USE DPO
```
from trl import DPOTrainer

dpo_trainer = DPOTrainer(
    model,                  # base model from SFT pipeline
    model_ref,              # typically a copy of the SFT-trained base model
    beta=0.1,               # temperature hyperparameter of DPO
    train_dataset=dataset,  # dataset prepared above
    tokenizer=tokenizer,    # tokenizer
    args=training_args,     # training arguments, e.g. batch size, lr, etc.
)
```
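The snippet above assumes model, model_ref, tokenizer, and training_args already exist. A minimal sketch of how they might be built (the model name is illustrative, and the positional DPOTrainer API shown matches trl around the time of this card; newer versions may differ):
```
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments

base = "mistralai/Mistral-7B-v0.1"  # illustrative; use your own SFT checkpoint
model = AutoModelForCausalLM.from_pretrained(base)
model_ref = AutoModelForCausalLM.from_pretrained(base)  # frozen reference copy
tokenizer = AutoTokenizer.from_pretrained(base)
training_args = TrainingArguments(
    output_dir="dpo-output",
    per_device_train_batch_size=2,
)
```
After constructing the trainer, dpo_trainer.train() starts the DPO run.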
## CITATION
```
@ONLINE{wikidump,
author = "Wikimedia Foundation",
title = "Wikimedia Downloads",
url = "https://dumps.wikimedia.org"
}
@misc{vonwerra2022trl,
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang},
title = {TRL: Transformer Reinforcement Learning},
year = {2020},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` | [
-0.12436157464981079,
-0.6176967620849609,
0.20499010384082794,
0.3497546911239624,
-0.03628838434815407,
0.16717781126499176,
-0.19733236730098724,
0.20319098234176636,
-0.04918521270155907,
0.3703615069389343,
-0.7524189352989197,
-0.46785441040992737,
-0.5413615107536316,
0.291599065065... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
breno30/wandaRV | breno30 | 2023-11-25T18:03:17Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T18:03:17Z | 2023-11-25T18:02:13.000Z | 2023-11-25T18:02:13 | ---
license: openrail
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Gabriel1322/jotacee | Gabriel1322 | 2023-11-25T18:06:58Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T18:06:58Z | 2023-11-25T18:04:50.000Z | 2023-11-25T18:04:50 | ---
license: openrail
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Marcos7fytyg/Dataset.Pain | Marcos7fytyg | 2023-11-25T18:13:56Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-25T18:13:56Z | 2023-11-25T18:08:55.000Z | 2023-11-25T18:08:55 | ---
license: apache-2.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Gabriel1322/joaocaetano | Gabriel1322 | 2023-11-25T18:19:55Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T18:19:55Z | 2023-11-25T18:19:10.000Z | 2023-11-25T18:19:10 | ---
license: openrail
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Gabriel1322/joaocaetano22 | Gabriel1322 | 2023-11-25T18:24:04Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T18:24:04Z | 2023-11-25T18:24:04.000Z | 2023-11-25T18:24:04 | ---
license: openrail
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
tennant/C-VQA | tennant | 2023-11-27T16:25:56Z | 0 | 0 | null | [
"region:us"
] | 2023-11-27T16:25:56Z | 2023-11-25T18:31:51.000Z | 2023-11-25T18:31:51 | ---
configs:
- config_name: default
data_files:
- split: test
path: "C-VQA-Real_questions.csv"
---
This repo contains the data for the C-VQA-Real dataset. For the complete data and for evaluating your model on our dataset, please refer to https://github.com/Letian2003/C-VQA.
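A minimal load sketch, following the config above (assumes the datasets library):
```
from datasets import load_dataset

cvqa = load_dataset("tennant/C-VQA", split="test")  # reads C-VQA-Real_questions.csv per the config
print(cvqa[0])
```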
| [
-0.30378279089927673,
-0.3087718188762665,
0.5307582020759583,
-0.07806499302387238,
-0.203391432762146,
0.03153309226036072,
0.027129871770739555,
-0.37578949332237244,
0.1044825091958046,
1.1731795072555542,
-1.1571027040481567,
-0.7288411855697632,
-0.060224082320928574,
-0.238865971565... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Geofab/ArtieAbams | Geofab | 2023-11-25T20:33:35Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T20:33:35Z | 2023-11-25T18:41:10.000Z | 2023-11-25T18:41:10 | Entry not found | [
-0.3227647542953491,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965083122253,
0.7915717959403992,
0.07618629932403564,
0.7746022343635559,
0.2563222348690033,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
chavanarvind/faces | chavanarvind | 2023-11-25T18:43:19Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-25T18:43:19Z | 2023-11-25T18:43:19.000Z | 2023-11-25T18:43:19 | ---
license: apache-2.0
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
MJFMBR/MJ | MJFMBR | 2023-11-25T19:02:17Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T19:02:17Z | 2023-11-25T19:01:43.000Z | 2023-11-25T19:01:43 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Carslos/procesos | Carslos | 2023-11-25T19:19:46Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T19:19:46Z | 2023-11-25T19:19:46.000Z | 2023-11-25T19:19:46 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
lidiashishina/car-parts-segmentation-tck28 | lidiashishina | 2023-11-25T19:21:10Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-25T19:21:10Z | 2023-11-25T19:21:10.000Z | 2023-11-25T19:21:10 | ---
license: apache-2.0
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
M4R10dziri/commonvoice_diacritized_v3 | M4R10dziri | 2023-11-25T19:31:08Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T19:31:08Z | 2023-11-25T19:31:08.000Z | 2023-11-25T19:31:08 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
harpreetsahota/elicit-bias-prompts | harpreetsahota | 2023-11-25T19:54:25Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T19:54:25Z | 2023-11-25T19:42:28.000Z | 2023-11-25T19:42:28 | ---
dataset_info:
features:
- name: Prompt
dtype: string
splits:
- name: train
num_bytes: 3851
num_examples: 64
download_size: 2447
dataset_size: 3851
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
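A minimal load sketch for this card (assumes the datasets library; the single Prompt string feature follows the schema above):
```
from datasets import load_dataset

prompts = load_dataset("harpreetsahota/elicit-bias-prompts", split="train")
print(prompts[0]["Prompt"])  # one of the 64 bias-eliciting prompts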
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
xanetti/cars_dataset | xanetti | 2023-11-25T19:46:11Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-25T19:46:11Z | 2023-11-25T19:46:11.000Z | 2023-11-25T19:46:11 | ---
license: mit
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Pablao0948/Marina_Sena | Pablao0948 | 2023-11-25T19:51:50Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T19:51:50Z | 2023-11-25T19:50:23.000Z | 2023-11-25T19:50:23 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
DataStudio/OCRNameEntityRed | DataStudio | 2023-11-25T19:59:13Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T19:59:13Z | 2023-11-25T19:57:41.000Z | 2023-11-25T19:57:41 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Norod78/WeirdOutfitStyle | Norod78 | 2023-11-25T20:05:22Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T20:05:22Z | 2023-11-25T20:01:26.000Z | 2023-11-25T20:01:26 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
harpreetsahota/adversarial-prompts | harpreetsahota | 2023-11-25T20:14:20Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T20:14:20Z | 2023-11-25T20:14:19.000Z | 2023-11-25T20:14:19 | ---
dataset_info:
features:
- name: Prompt
dtype: string
splits:
- name: train
num_bytes: 2366
num_examples: 37
download_size: 2228
dataset_size: 2366
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
DataStudio/OCRNameEntityRed_part2 | DataStudio | 2023-11-25T20:15:41Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T20:15:41Z | 2023-11-25T20:15:38.000Z | 2023-11-25T20:15:38 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: '14'
num_bytes: 9326603.25
num_examples: 1150
- name: '30'
num_bytes: 6985940.25
num_examples: 1150
- name: Oxford
num_bytes: 6934480.25
num_examples: 1150
download_size: 23479178
dataset_size: 23247023.75
configs:
- config_name: default
data_files:
- split: '14'
path: data/14-*
- split: '30'
path: data/30-*
- split: Oxford
path: data/Oxford-*
---
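A minimal load sketch for one of the named splits above (assumes the datasets library; split names follow the config):
```
from datasets import load_dataset

ocr = load_dataset("DataStudio/OCRNameEntityRed_part2", split="Oxford")
example = ocr[0]
img, txt = example["image"], example["text"]  # PIL image plus its text label
```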
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
tonysun9/flickr-t5 | tonysun9 | 2023-11-25T20:18:28Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-25T20:18:28Z | 2023-11-25T20:18:25.000Z | 2023-11-25T20:18:25 | ---
license: apache-2.0
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ansah4525/earrings1 | ansah4525 | 2023-11-25T20:33:10Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T20:33:10Z | 2023-11-25T20:33:10.000Z | 2023-11-25T20:33:10 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Gabriel1322/joaocaetanodataset | Gabriel1322 | 2023-11-25T21:30:21Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T21:30:21Z | 2023-11-25T20:56:03.000Z | 2023-11-25T20:56:03 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Xokito/datasetfalando | Xokito | 2023-11-25T20:56:43Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T20:56:43Z | 2023-11-25T20:56:03.000Z | 2023-11-25T20:56:03 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Xokito/datasetcantando | Xokito | 2023-11-25T20:57:30Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T20:57:30Z | 2023-11-25T20:56:59.000Z | 2023-11-25T20:56:59 | ---
license: openrail
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
boborr/FLUTTER | boborr | 2023-11-25T21:20:18Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T21:20:18Z | 2023-11-25T21:19:42.000Z | 2023-11-25T21:19:42 | ---
license: openrail
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
InfernoDeep/MesogenesTask1Parquet_ALL_Labels_224x224 | InfernoDeep | 2023-11-25T21:36:38Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T21:36:38Z | 2023-11-25T21:29:05.000Z | 2023-11-25T21:29:05 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_NurtureAI__Orca-2-7B-16k_public | open-llm-leaderboard | 2023-11-25T21:42:51Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T21:42:51Z | 2023-11-25T21:42:06.000Z | 2023-11-25T21:42:06 | ---
pretty_name: Evaluation run of NurtureAI/Orca-2-7B-16k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NurtureAI/Orca-2-7B-16k](https://huggingface.co/NurtureAI/Orca-2-7B-16k) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NurtureAI__Orca-2-7B-16k_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-25T21:39:02.599324](https://huggingface.co/datasets/open-llm-leaderboard/details_NurtureAI__Orca-2-7B-16k_public/blob/main/results_2023-11-25T21-39-02.599324.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.36746546712957223,\n\
\ \"acc_stderr\": 0.033751277531008754,\n \"acc_norm\": 0.3738175555586316,\n\
\ \"acc_norm_stderr\": 0.03459812342976094,\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.45373679597767685,\n\
\ \"mc2_stderr\": 0.015753224924844992,\n \"em\": 0.21046560402684564,\n\
\ \"em_stderr\": 0.004174608410380015,\n \"f1\": 0.267364723154363,\n\
\ \"f1_stderr\": 0.004242093940617827\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4735494880546075,\n \"acc_stderr\": 0.014590931358120174,\n\
\ \"acc_norm\": 0.5059726962457338,\n \"acc_norm_stderr\": 0.014610348300255795\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.47410874327823144,\n\
\ \"acc_stderr\": 0.004983087049281742,\n \"acc_norm\": 0.6389165504879506,\n\
\ \"acc_norm_stderr\": 0.00479333052565621\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.0403356566784832,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.0403356566784832\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.39622641509433965,\n \"acc_stderr\": 0.030102793781791194,\n\
\ \"acc_norm\": 0.39622641509433965,\n \"acc_norm_stderr\": 0.030102793781791194\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3819444444444444,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.3819444444444444,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.3468208092485549,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.02951319662553935,\n\
\ \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.02951319662553935\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948368,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948368\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.037649508797906066,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.037649508797906066\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.02786932057166463,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.02786932057166463\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n\
\ \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.038881769216741004,\n\
\ \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.038881769216741004\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4292929292929293,\n \"acc_stderr\": 0.03526552724601198,\n \"\
acc_norm\": 0.4292929292929293,\n \"acc_norm_stderr\": 0.03526552724601198\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5284974093264249,\n \"acc_stderr\": 0.03602573571288441,\n\
\ \"acc_norm\": 0.5284974093264249,\n \"acc_norm_stderr\": 0.03602573571288441\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.024035489676335065,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.024035489676335065\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603826,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603826\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5229357798165137,\n \"acc_stderr\": 0.0214147570581755,\n \"acc_norm\"\
: 0.5229357798165137,\n \"acc_norm_stderr\": 0.0214147570581755\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.03099866630456052,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.03099866630456052\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.03492406104163613,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.03492406104163613\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6033755274261603,\n \"acc_stderr\": 0.03184399873811226,\n \
\ \"acc_norm\": 0.6033755274261603,\n \"acc_norm_stderr\": 0.03184399873811226\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4304932735426009,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.4304932735426009,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.45038167938931295,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.45038167938931295,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4049586776859504,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.4049586776859504,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04750077341199986,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04750077341199986\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3803680981595092,\n \"acc_stderr\": 0.03814269893261837,\n\
\ \"acc_norm\": 0.3803680981595092,\n \"acc_norm_stderr\": 0.03814269893261837\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.36893203883495146,\n \"acc_stderr\": 0.047776151811567386,\n\
\ \"acc_norm\": 0.36893203883495146,\n \"acc_norm_stderr\": 0.047776151811567386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.43162393162393164,\n\
\ \"acc_stderr\": 0.0324483553531149,\n \"acc_norm\": 0.43162393162393164,\n\
\ \"acc_norm_stderr\": 0.0324483553531149\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.40485312899106,\n\
\ \"acc_stderr\": 0.017553246467720256,\n \"acc_norm\": 0.40485312899106,\n\
\ \"acc_norm_stderr\": 0.017553246467720256\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3959537572254335,\n \"acc_stderr\": 0.026329813341946243,\n\
\ \"acc_norm\": 0.3959537572254335,\n \"acc_norm_stderr\": 0.026329813341946243\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961464,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961464\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3954248366013072,\n \"acc_stderr\": 0.027996723180631438,\n\
\ \"acc_norm\": 0.3954248366013072,\n \"acc_norm_stderr\": 0.027996723180631438\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.36012861736334406,\n\
\ \"acc_stderr\": 0.027264297599804015,\n \"acc_norm\": 0.36012861736334406,\n\
\ \"acc_norm_stderr\": 0.027264297599804015\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.027513747284379424,\n\
\ \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.027513747284379424\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.31560283687943264,\n \"acc_stderr\": 0.02772498944950931,\n \
\ \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.02772498944950931\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.29921773142112124,\n\
\ \"acc_stderr\": 0.01169537463069603,\n \"acc_norm\": 0.29921773142112124,\n\
\ \"acc_norm_stderr\": 0.01169537463069603\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3897058823529412,\n \"acc_stderr\": 0.029624663581159696,\n\
\ \"acc_norm\": 0.3897058823529412,\n \"acc_norm_stderr\": 0.029624663581159696\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3349673202614379,\n \"acc_stderr\": 0.019094228167000325,\n \
\ \"acc_norm\": 0.3349673202614379,\n \"acc_norm_stderr\": 0.019094228167000325\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.37272727272727274,\n\
\ \"acc_stderr\": 0.04631381319425463,\n \"acc_norm\": 0.37272727272727274,\n\
\ \"acc_norm_stderr\": 0.04631381319425463\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4204081632653061,\n \"acc_stderr\": 0.03160106993449604,\n\
\ \"acc_norm\": 0.4204081632653061,\n \"acc_norm_stderr\": 0.03160106993449604\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.472636815920398,\n\
\ \"acc_stderr\": 0.035302355173346824,\n \"acc_norm\": 0.472636815920398,\n\
\ \"acc_norm_stderr\": 0.035302355173346824\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n\
\ \"acc_stderr\": 0.03726214354322415,\n \"acc_norm\": 0.35542168674698793,\n\
\ \"acc_norm_stderr\": 0.03726214354322415\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.45373679597767685,\n\
\ \"mc2_stderr\": 0.015753224924844992\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5422257300710339,\n \"acc_stderr\": 0.014002284504422435\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.21046560402684564,\n \
\ \"em_stderr\": 0.004174608410380015,\n \"f1\": 0.267364723154363,\n \
\ \"f1_stderr\": 0.004242093940617827\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.015163002274450341,\n \"acc_stderr\": 0.0033660229497263225\n\
\ }\n}\n```"
repo_url: https://huggingface.co/NurtureAI/Orca-2-7B-16k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|arc:challenge|25_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|drop|3_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|gsm8k|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hellaswag|10_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|winogrande|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-25T21-39-02.599324.parquet'
- config_name: results
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- results_2023-11-25T21-39-02.599324.parquet
- split: latest
path:
- results_2023-11-25T21-39-02.599324.parquet
---
# Dataset Card for Evaluation run of NurtureAI/Orca-2-7B-16k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NurtureAI/Orca-2-7B-16k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NurtureAI/Orca-2-7B-16k](https://huggingface.co/NurtureAI/Orca-2-7B-16k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NurtureAI__Orca-2-7B-16k_public",
"harness_winogrande_5",
split="train")
```
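Each configuration also exposes a `latest` split, and the aggregated metrics live in the `results` configuration (see the listing above). As a sketch, loading those directly looks like this; only the config and split names change:
```python
from datasets import load_dataset

# Load the aggregated metrics: the "results" config at its "latest" split.
# Config and split names are taken from the configuration listing above.
results = load_dataset(
    "open-llm-leaderboard/details_NurtureAI__Orca-2-7B-16k_public",
    "results",
    split="latest",
)
```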
## Latest results
These are the [latest results from run 2023-11-25T21:39:02.599324](https://huggingface.co/datasets/open-llm-leaderboard/details_NurtureAI__Orca-2-7B-16k_public/blob/main/results_2023-11-25T21-39-02.599324.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.36746546712957223,
"acc_stderr": 0.033751277531008754,
"acc_norm": 0.3738175555586316,
"acc_norm_stderr": 0.03459812342976094,
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.45373679597767685,
"mc2_stderr": 0.015753224924844992,
"em": 0.21046560402684564,
"em_stderr": 0.004174608410380015,
"f1": 0.267364723154363,
"f1_stderr": 0.004242093940617827
},
"harness|arc:challenge|25": {
"acc": 0.4735494880546075,
"acc_stderr": 0.014590931358120174,
"acc_norm": 0.5059726962457338,
"acc_norm_stderr": 0.014610348300255795
},
"harness|hellaswag|10": {
"acc": 0.47410874327823144,
"acc_stderr": 0.004983087049281742,
"acc_norm": 0.6389165504879506,
"acc_norm_stderr": 0.00479333052565621
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.0403356566784832,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.0403356566784832
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.39622641509433965,
"acc_stderr": 0.030102793781791194,
"acc_norm": 0.39622641509433965,
"acc_norm_stderr": 0.030102793781791194
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3819444444444444,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.3819444444444444,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.02951319662553935,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.02951319662553935
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948368,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948368
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.037649508797906066,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.037649508797906066
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4,
"acc_stderr": 0.02786932057166463,
"acc_norm": 0.4,
"acc_norm_stderr": 0.02786932057166463
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.038881769216741004,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.038881769216741004
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4292929292929293,
"acc_stderr": 0.03526552724601198,
"acc_norm": 0.4292929292929293,
"acc_norm_stderr": 0.03526552724601198
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5284974093264249,
"acc_stderr": 0.03602573571288441,
"acc_norm": 0.5284974093264249,
"acc_norm_stderr": 0.03602573571288441
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.024035489676335065,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.024035489676335065
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275794,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275794
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3067226890756303,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.3067226890756303,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.036030385453603826,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.036030385453603826
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5229357798165137,
"acc_stderr": 0.0214147570581755,
"acc_norm": 0.5229357798165137,
"acc_norm_stderr": 0.0214147570581755
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03099866630456052,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03099866630456052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.03492406104163613,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.03492406104163613
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6033755274261603,
"acc_stderr": 0.03184399873811226,
"acc_norm": 0.6033755274261603,
"acc_norm_stderr": 0.03184399873811226
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4304932735426009,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.4304932735426009,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.45038167938931295,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.45038167938931295,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4049586776859504,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.4049586776859504,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04750077341199986,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04750077341199986
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3803680981595092,
"acc_stderr": 0.03814269893261837,
"acc_norm": 0.3803680981595092,
"acc_norm_stderr": 0.03814269893261837
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.36893203883495146,
"acc_stderr": 0.047776151811567386,
"acc_norm": 0.36893203883495146,
"acc_norm_stderr": 0.047776151811567386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.43162393162393164,
"acc_stderr": 0.0324483553531149,
"acc_norm": 0.43162393162393164,
"acc_norm_stderr": 0.0324483553531149
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.40485312899106,
"acc_stderr": 0.017553246467720256,
"acc_norm": 0.40485312899106,
"acc_norm_stderr": 0.017553246467720256
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3959537572254335,
"acc_stderr": 0.026329813341946243,
"acc_norm": 0.3959537572254335,
"acc_norm_stderr": 0.026329813341946243
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961464,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3954248366013072,
"acc_stderr": 0.027996723180631438,
"acc_norm": 0.3954248366013072,
"acc_norm_stderr": 0.027996723180631438
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.36012861736334406,
"acc_stderr": 0.027264297599804015,
"acc_norm": 0.36012861736334406,
"acc_norm_stderr": 0.027264297599804015
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.027513747284379424,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.027513747284379424
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.31560283687943264,
"acc_stderr": 0.02772498944950931,
"acc_norm": 0.31560283687943264,
"acc_norm_stderr": 0.02772498944950931
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.29921773142112124,
"acc_stderr": 0.01169537463069603,
"acc_norm": 0.29921773142112124,
"acc_norm_stderr": 0.01169537463069603
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3897058823529412,
"acc_stderr": 0.029624663581159696,
"acc_norm": 0.3897058823529412,
"acc_norm_stderr": 0.029624663581159696
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3349673202614379,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.3349673202614379,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.37272727272727274,
"acc_stderr": 0.04631381319425463,
"acc_norm": 0.37272727272727274,
"acc_norm_stderr": 0.04631381319425463
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4204081632653061,
"acc_stderr": 0.03160106993449604,
"acc_norm": 0.4204081632653061,
"acc_norm_stderr": 0.03160106993449604
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.472636815920398,
"acc_stderr": 0.035302355173346824,
"acc_norm": 0.472636815920398,
"acc_norm_stderr": 0.035302355173346824
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322415
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.45373679597767685,
"mc2_stderr": 0.015753224924844992
},
"harness|winogrande|5": {
"acc": 0.5422257300710339,
"acc_stderr": 0.014002284504422435
},
"harness|drop|3": {
"em": 0.21046560402684564,
"em_stderr": 0.004174608410380015,
"f1": 0.267364723154363,
"f1_stderr": 0.004242093940617827
},
"harness|gsm8k|5": {
"acc": 0.015163002274450341,
"acc_stderr": 0.0033660229497263225
}
}
```
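To re-aggregate these numbers yourself, a minimal sketch is shown below. It assumes the JSON above has been saved locally as `results.json` (a hypothetical filename), and its simple unweighted mean over the MMLU subtasks is an illustration rather than the leaderboard's own aggregation:
```python
import json

# Minimal sketch: mean accuracy over the "hendrycksTest" (MMLU) subtasks
# from the results JSON above, assumed saved locally as results.json.
with open("results.json") as f:
    results = json.load(f)

mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```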
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7076535224914551,
-0.8884876370429993,
0.21383579075336456,
0.21327486634254456,
-0.16560819745063782,
-0.07279124110937119,
0.028583496809005737,
-0.2779676616191864,
0.587836503982544,
-0.06676704436540604,
-0.48849764466285706,
-0.6924371719360352,
-0.42478519678115845,
0.23139260709... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Gabriel1322/jutalo | Gabriel1322 | 2023-11-25T21:53:16Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T21:53:16Z | 2023-11-25T21:51:13.000Z | 2023-11-25T21:51:13 | ---
license: openrail
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
potaot/splat | potaot | 2023-11-25T22:20:39Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T22:20:39Z | 2023-11-25T22:13:09.000Z | 2023-11-25T22:13:09 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_NurtureAI__openchat_3.5-16k_public | open-llm-leaderboard | 2023-11-25T22:24:30Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T22:24:30Z | 2023-11-25T22:23:43.000Z | 2023-11-25T22:23:43 | ---
pretty_name: Evaluation run of NurtureAI/openchat_3.5-16k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NurtureAI/openchat_3.5-16k](https://huggingface.co/NurtureAI/openchat_3.5-16k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NurtureAI__openchat_3.5-16k_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-25T22:20:43.061836](https://huggingface.co/datasets/open-llm-leaderboard/details_NurtureAI__openchat_3.5-16k_public/blob/main/results_2023-11-25T22-20-43.061836.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6150624189136383,\n\
\ \"acc_stderr\": 0.0326145578895764,\n \"acc_norm\": 0.6229469261918253,\n\
\ \"acc_norm_stderr\": 0.0333127688298104,\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.01602157061376854,\n \"mc2\": 0.43468174693453937,\n\
\ \"mc2_stderr\": 0.014850723705548515,\n \"em\": 0.0017827181208053692,\n\
\ \"em_stderr\": 0.00043200973460388745,\n \"f1\": 0.06930893456375835,\n\
\ \"f1_stderr\": 0.0014539755752351418\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5853242320819113,\n \"acc_stderr\": 0.014397070564409174,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104296\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6290579565823541,\n\
\ \"acc_stderr\": 0.004820697457420415,\n \"acc_norm\": 0.8357896833300139,\n\
\ \"acc_norm_stderr\": 0.0036970918376320757\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.03266204299064678,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.03266204299064678\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424648,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424648\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268552,\n \"\
acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268552\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\"\
: 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062153,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062153\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n\
\ \"acc_stderr\": 0.029102254389674082,\n \"acc_norm\": 0.7794117647058824,\n\
\ \"acc_norm_stderr\": 0.029102254389674082\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n\
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822914,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822914\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794089,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794089\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757435,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757435\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3865921787709497,\n\
\ \"acc_stderr\": 0.016286674879101026,\n \"acc_norm\": 0.3865921787709497,\n\
\ \"acc_norm_stderr\": 0.016286674879101026\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.02508947852376513,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.02508947852376513\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729147,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729147\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n\
\ \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n\
\ \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n\
\ \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6437908496732027,\n \"acc_stderr\": 0.0193733324207245,\n \
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.0193733324207245\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.02796267760476892,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.02796267760476892\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.01602157061376854,\n \"mc2\": 0.43468174693453937,\n\
\ \"mc2_stderr\": 0.014850723705548515\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8011049723756906,\n \"acc_stderr\": 0.011218629972515316\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0017827181208053692,\n \
\ \"em_stderr\": 0.00043200973460388745,\n \"f1\": 0.06930893456375835,\n\
\ \"f1_stderr\": 0.0014539755752351418\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.21834723275208492,\n \"acc_stderr\": 0.011379497266738047\n\
\ }\n}\n```"
repo_url: https://huggingface.co/NurtureAI/openchat_3.5-16k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|arc:challenge|25_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|drop|3_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|gsm8k|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hellaswag|10_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T22-20-43.061836.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-25T22-20-43.061836.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- '**/details_harness|winogrande|5_2023-11-25T22-20-43.061836.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-25T22-20-43.061836.parquet'
- config_name: results
data_files:
- split: 2023_11_25T22_20_43.061836
path:
- results_2023-11-25T22-20-43.061836.parquet
- split: latest
path:
- results_2023-11-25T22-20-43.061836.parquet
---
# Dataset Card for Evaluation run of NurtureAI/openchat_3.5-16k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NurtureAI/openchat_3.5-16k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NurtureAI/openchat_3.5-16k](https://huggingface.co/NurtureAI/openchat_3.5-16k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NurtureAI__openchat_3.5-16k_public",
"harness_winogrande_5",
split="train")
```
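The aggregated metrics live in the `results` configuration declared above; here is a minimal sketch of pulling the latest aggregate (same `load_dataset` API as the snippet above; the `print` is only illustrative):
```python
from datasets import load_dataset

# The "results" config stores the aggregated run metrics; per the configs
# above, its "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_NurtureAI__openchat_3.5-16k_public",
    "results",
    split="latest",
)
print(results[0])
```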
## Latest results
These are the [latest results from run 2023-11-25T22:20:43.061836](https://huggingface.co/datasets/open-llm-leaderboard/details_NurtureAI__openchat_3.5-16k_public/blob/main/results_2023-11-25T22-20-43.061836.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6150624189136383,
"acc_stderr": 0.0326145578895764,
"acc_norm": 0.6229469261918253,
"acc_norm_stderr": 0.0333127688298104,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.01602157061376854,
"mc2": 0.43468174693453937,
"mc2_stderr": 0.014850723705548515,
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460388745,
"f1": 0.06930893456375835,
"f1_stderr": 0.0014539755752351418
},
"harness|arc:challenge|25": {
"acc": 0.5853242320819113,
"acc_stderr": 0.014397070564409174,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104296
},
"harness|hellaswag|10": {
"acc": 0.6290579565823541,
"acc_stderr": 0.004820697457420415,
"acc_norm": 0.8357896833300139,
"acc_norm_stderr": 0.0036970918376320757
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424648,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424648
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268552,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062153,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062153
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822914,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822914
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794089,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794089
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757435,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757435
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3865921787709497,
"acc_stderr": 0.016286674879101026,
"acc_norm": 0.3865921787709497,
"acc_norm_stderr": 0.016286674879101026
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.02508947852376513,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.02508947852376513
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.02970045324729147,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.02970045324729147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.0193733324207245,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.0193733324207245
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.02796267760476892,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.02796267760476892
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.01602157061376854,
"mc2": 0.43468174693453937,
"mc2_stderr": 0.014850723705548515
},
"harness|winogrande|5": {
"acc": 0.8011049723756906,
"acc_stderr": 0.011218629972515316
},
"harness|drop|3": {
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460388745,
"f1": 0.06930893456375835,
"f1_stderr": 0.0014539755752351418
},
"harness|gsm8k|5": {
"acc": 0.21834723275208492,
"acc_stderr": 0.011379497266738047
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7114163637161255,
-0.8952910900115967,
0.23034705221652985,
0.24868737161159515,
-0.15623080730438232,
-0.06456933915615082,
-0.0035735375713557005,
-0.23871926963329315,
0.5934535264968872,
-0.0699675902724266,
-0.48554423451423645,
-0.694831132888794,
-0.40953728556632996,
0.225997164... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
sbgs/mental-health-dataset-mistral-7b | sbgs | 2023-11-25T22:31:25Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T22:31:25Z | 2023-11-25T22:28:22.000Z | 2023-11-25T22:28:22 | The original dataset is from Amod/mental_health_counseling_conversations and has been modified to be used for training Mistral 7B. | [
-0.4914844036102295,
-0.5515197515487671,
0.1435990333557129,
0.045700833201408386,
-0.24369703233242035,
-0.19911843538284302,
-0.12659446895122528,
-0.5221384763717651,
0.6686198711395264,
1.0019991397857666,
-0.958907961845398,
-0.4276122748851776,
-0.3448325991630554,
0.134504005312919... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Gabriel1322/austin | Gabriel1322 | 2023-11-25T22:48:45Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T22:48:45Z | 2023-11-25T22:33:47.000Z | 2023-11-25T22:33:47 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
yerkekz/mini-platypus | yerkekz | 2023-11-25T22:36:52Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T22:36:52Z | 2023-11-25T22:36:51.000Z | 2023-11-25T22:36:51 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245924
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
pavitemple/Accident-multiple-labels | pavitemple | 2023-11-25T22:53:14Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T22:53:14Z | 2023-11-25T22:43:13.000Z | 2023-11-25T22:43:13 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
colab-account/res | colab-account | 2023-11-25T22:49:35Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T22:49:35Z | 2023-11-25T22:46:21.000Z | 2023-11-25T22:46:21 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
samuli/finnish-theses | samuli | 2023-11-25T23:02:21Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T23:02:21Z | 2023-11-25T22:59:54.000Z | 2023-11-25T22:59:54 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Deni016/jorge | Deni016 | 2023-11-25T23:02:57Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T23:02:57Z | 2023-11-25T23:02:24.000Z | 2023-11-25T23:02:24 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
hlapp/ubergraph | hlapp | 2023-11-25T23:29:34Z | 0 | 0 | null | [
"license:bsd-3-clause",
"region:us"
] | 2023-11-25T23:29:34Z | 2023-11-25T23:14:00.000Z | 2023-11-25T23:14:00 | ---
license: bsd-3-clause
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
portafolio/prueba30sg | portafolio | 2023-11-25T23:15:54Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T23:15:54Z | 2023-11-25T23:15:03.000Z | 2023-11-25T23:15:03 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
patrickocal/gov-report-kgs | patrickocal | 2023-11-26T00:58:25Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-26T00:58:25Z | 2023-11-25T23:15:16.000Z | 2023-11-25T23:15:16 | ---
license: mit
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Tamazight-NLP/AmaWar | Tamazight-NLP | 2023-11-26T00:06:33Z | 0 | 0 | null | [
"task_categories:translation",
"task_categories:text2text-generation",
"size_categories:1K<n<10K",
"language:ber",
"language:ar",
"region:us"
] | 2023-11-26T00:06:33Z | 2023-11-25T23:19:50.000Z | 2023-11-25T23:19:50 | ---
task_categories:
- translation
- text2text-generation
language:
- ber
- ar
pretty_name: Amawal Warayni
size_categories:
- 1K<n<10K
---
# Amawal Warayni
Bitext scraped from the online [AmaWar](https://amawalwarayni.com/) dictionary of the Tamazight dialect of Ait Warain spoken in northeastern Morocco.
Contains sentences, stories, and poems in Tamazight along with their translations into Modern Standard Arabic.
Big thanks to Dr. Noureddine Amhaoui for his amazing work.
# Citation
```
Noureddine Amhaoui. (2021). A computerized dictionary of the meanings of Warayni Amazigh nouns and verbs (Amazigh-Arabic).
Retrieved November 15, 2023, from https://amawalwarayni.com/
```
| [
-0.6648110747337341,
-0.6475744247436523,
0.15187734365463257,
0.6855517625808716,
-0.5033351182937622,
0.014953751116991043,
-0.03272826224565506,
-0.7418371438980103,
0.9314279556274414,
0.6012329459190369,
-0.6018611788749695,
-0.34105417132377625,
-0.5541611313819885,
0.192379027605056... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
portafolio/03prueba | portafolio | 2023-11-25T23:21:49Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T23:21:49Z | 2023-11-25T23:20:36.000Z | 2023-11-25T23:20:36 | Entry not found | [
-0.32276472449302673,
-0.22568407654762268,
0.8622258901596069,
0.4346148371696472,
-0.5282984972000122,
0.7012965679168701,
0.7915717363357544,
0.07618629932403564,
0.7746022939682007,
0.2563222646713257,
-0.785281777381897,
-0.22573848068714142,
-0.9104482531547546,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Pablao0948/Austin_Mahone | Pablao0948 | 2023-11-25T23:36:19Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T23:36:19Z | 2023-11-25T23:33:21.000Z | 2023-11-25T23:33:21 | ---
license: openrail
---
| [
-0.12853392958641052,
-0.18616779148578644,
0.6529127955436707,
0.49436280131340027,
-0.19319361448287964,
0.23607419431209564,
0.36072003841400146,
0.050563063472509384,
0.579365611076355,
0.7400140762329102,
-0.6508104205131531,
-0.23783954977989197,
-0.7102249264717102,
-0.0478260256350... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Pablao0948/Austin_Mahone2 | Pablao0948 | 2023-11-25T23:41:23Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-25T23:41:23Z | 2023-11-25T23:40:46.000Z | 2023-11-25T23:40:46 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
MakimasFeet/bubmoni | MakimasFeet | 2023-11-25T23:54:32Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T23:54:32Z | 2023-11-25T23:53:47.000Z | 2023-11-25T23:53:47 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
IdolAI/ILLM-Basic-0.1 | IdolAI | 2023-11-25T23:58:14Z | 0 | 0 | null | [
"region:us"
] | 2023-11-25T23:58:14Z | 2023-11-25T23:58:14.000Z | 2023-11-25T23:58:14 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Deni016/iaratinho | Deni016 | 2023-11-26T00:21:21Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-26T00:21:21Z | 2023-11-26T00:20:44.000Z | 2023-11-26T00:20:44 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Gabriel1322/lucas | Gabriel1322 | 2023-11-26T01:13:44Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-26T01:13:44Z | 2023-11-26T01:13:23.000Z | 2023-11-26T01:13:23 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
OpenGVLab/SA-Med2D-20M | OpenGVLab | 2023-11-26T01:24:54Z | 0 | 0 | null | [
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2023-11-26T01:24:54Z | 2023-11-26T01:24:54.000Z | 2023-11-26T01:24:54 | ---
license: cc-by-nc-sa-4.0
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
typeof/GRIT-top-50k | typeof | 2023-11-26T01:41:10Z | 0 | 1 | null | [
"region:us"
] | 2023-11-26T01:41:10Z | 2023-11-26T01:38:56.000Z | 2023-11-26T01:38:56 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
kelen0102/Enrico_Pucci | kelen0102 | 2023-11-26T01:51:08Z | 0 | 0 | null | [
"license:openrail",
"region:us"
] | 2023-11-26T01:51:08Z | 2023-11-26T01:50:43.000Z | 2023-11-26T01:50:43 | ---
license: openrail
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Bluebomber182/Ariel-From-The-Little-Mermaid | Bluebomber182 | 2023-11-26T02:40:07Z | 0 | 0 | null | [
"license:unknown",
"region:us"
] | 2023-11-26T02:40:07Z | 2023-11-26T02:38:10.000Z | 2023-11-26T02:38:10 | ---
license: unknown
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
nlewins/cebuano-dictionary-words-with-audio | nlewins | 2023-11-27T23:26:48Z | 0 | 0 | null | [
"region:us"
] | 2023-11-27T23:26:48Z | 2023-11-26T02:45:58.000Z | 2023-11-26T02:45:58 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: en
dtype: string
- name: audio_ceb
dtype:
audio:
sampling_rate: 16000
- name: ceb
dtype: string
splits:
- name: train
num_bytes: 1705043074.3144617
num_examples: 12267
- name: test
num_bytes: 212551327.91861483
num_examples: 1534
- name: valid
num_bytes: 214294752.11492345
num_examples: 1533
download_size: 2127478373
dataset_size: 2131889154.348
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
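A minimal sketch of inspecting one row of this bilingual speech corpus (column names come from the `dataset_info` above; the decoding behaviour assumes the standard `datasets` `Audio` feature):
```python
from datasets import load_dataset

# Load the train split declared above; audio columns decode to dicts with
# "path", "array", and "sampling_rate" keys via the Audio feature.
ds = load_dataset("nlewins/cebuano-dictionary-words-with-audio", split="train")
row = ds[0]
print(row["en"], "->", row["ceb"])    # English / Cebuano text pair
print(row["audio"]["sampling_rate"])  # 16000 per the dataset_info above
```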
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Soaresuz/freddy | Soaresuz | 2023-11-26T02:49:22Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T02:49:22Z | 2023-11-26T02:49:01.000Z | 2023-11-26T02:49:01 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
olp0qlo/lung-cancer-dataset | olp0qlo | 2023-11-26T02:51:50Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T02:51:50Z | 2023-11-26T02:51:49.000Z | 2023-11-26T02:51:49 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Wanfq/distillchat_v1_mixture | Wanfq | 2023-11-26T12:37:55Z | 0 | 1 | null | [
"size_categories:10K<n<100K",
"language:en",
"license:odc-by",
"distillchat",
"SFT",
"region:us"
] | 2023-11-26T12:37:55Z | 2023-11-26T03:03:24.000Z | 2023-11-26T03:03:24 | ---
license: odc-by
language:
- en
pretty_name: DistillChat V1 Mixture
size_categories:
- 10K<n<100K
tags:
- distillchat
- SFT
---
# Dataset Card for DistillChat V1 Mixture
*Note the [ODC-BY license](https://opendatacommons.org/licenses/by/1-0/), indicating that different licenses apply to subsets of the data. This means that some portions of the dataset are non-commercial. We present the mixture as a research artifact.*
The dataset consists of a mix of:
## General Ability
* [sharegpt_gpt4](https://huggingface.co/datasets/Wanfq/sharegpt_gpt4): All 6.21k examples.
* [pure_dove](https://huggingface.co/datasets/Wanfq/pure_dove): All 3.86k examples.
* [verified_camel](https://huggingface.co/datasets/Wanfq/verified_camel): All 0.127k examples.
* [lesswrong_amplify_instruct](https://huggingface.co/datasets/Wanfq/lesswrong_amplify_instruct): All 0.663k examples.
* [orca_best](https://huggingface.co/datasets/Wanfq/orca_best): Sampled 10k examples from 329k examples.
* [oasst_top1](https://huggingface.co/datasets/Wanfq/oasst_top1): Sampled 5k examples from 12.9k examples.
* [airoboros](https://huggingface.co/datasets/Wanfq/airoboros): Sampled 10k examples from 42.7k examples.
* [wizardlm](https://huggingface.co/datasets/Wanfq/wizardlm): Sampled 10k examples from 154k examples.
* [no_robots](https://huggingface.co/datasets/Wanfq/no_robots): All 9.5k examples.
* [ultrachat_200k](https://huggingface.co/datasets/Wanfq/ultrachat_200k): Sampled 10k examples from 208k examples.
## Coding Ability
* [glaive_code_assistant](https://huggingface.co/datasets/Wanfq/glaive_code_assistant): Sampled 5k examples from 215k examples.
* [python_code](https://huggingface.co/datasets/Wanfq/python_code): Sampled 5k examples from 22.6k examples.
* [wizardcoder](https://huggingface.co/datasets/Wanfq/wizardcoder): Sampled 5k examples from 111k examples.
## Mathematics Ability
* [metamathqa](https://huggingface.co/datasets/Wanfq/metamathqa): Sampled 5k examples from 395k examples (see the sampling sketch after this list).
* [mathinstruct](https://huggingface.co/datasets/Wanfq/mathinstruct): Sampled 5k examples from 142k examples.
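A rough sketch of the subsampling described in the lists above, using `datasets` (the split name, seed, and shuffle-then-select method are assumptions; the card only states the sample sizes):
```python
from datasets import load_dataset

# Hypothetical reproduction of "metamathqa: Sampled 5k examples from 395k
# examples" above; the actual sampling procedure is not documented.
source = load_dataset("Wanfq/metamathqa", split="train")
subset = source.shuffle(seed=42).select(range(5000))
print(len(subset))  # 5000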
**Model Family:** All the models and the dataset are found in the [DistillChat collection](https://huggingface.co/collections/Wanfq/distillchat-6562c1fe4e74b2075a0e617e).
The length distribution of the dataset can be seen below:
* distillchat_v1_clean_split_2048_filter_wrong
| Statistics | Value |
|:---|:---:|
| #sequence | 85.53 K |
| #tokens | 54.01 M |
| avg. turns | 1.49 |
| avg. prompt length | 109.28 |
| avg. response length | 315.89 |
| L0 - 1024 | 68388 |
| L1024 - 2048 | 16560 |
| L2048 - 4096 | 535 |
| L4096 - 8192 | 42 |
| L8192 - 16384 | 2 |
| L16384 - 32768 | 0 |
* distillchat_v1_clean_split_8192_filter_wrong
| Statistics | Value |
|:---|:---:|
| #sequence | 82.10 K |
| #tokens | 56.13 M |
| avg. turns | 1.54 |
| avg. prompt length | 123.87 |
| avg. response length | 318.64 |
| L0 - 1024 | 67583 |
| L1024 - 2048 | 10469 |
| L2048 - 4096 | 3165 |
| L4096 - 8192 | 878 |
| L8192 - 16384 | 2 |
| L16384 - 32768 | 0 |
### License
We are releasing this dataset under the terms of [ODC-BY](https://opendatacommons.org/licenses/by/1-0/). By using this, you are also bound by the [Common Crawl terms of use](https://commoncrawl.org/terms-of-use/) in respect of the content contained in the dataset. | [
-0.550764262676239,
-0.7647420763969421,
-0.08264017105102539,
0.3275876045227051,
-0.36302828788757324,
-0.03887395188212395,
0.047940775752067566,
-0.30674606561660767,
0.43585649132728577,
0.4590851962566376,
-0.7542157769203186,
-0.7516927123069763,
-0.5905312299728394,
0.0536366514861... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Wanfq/airoboros | Wanfq | 2023-11-26T04:10:15Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T04:10:15Z | 2023-11-26T03:11:32.000Z | 2023-11-26T03:11:32 | https://huggingface.co/datasets/jondurbin/airoboros-2.2.1
features: general, single-turn, chat
length: 42.7k | [
-0.7728633880615234,
-0.32110559940338135,
-0.263423889875412,
0.54703289270401,
-0.5794287919998169,
-0.3329944610595703,
-0.14306427538394928,
-0.5856236815452576,
0.9960456490516663,
0.8127456307411194,
-0.7789598703384399,
-0.49846383929252625,
-0.5435715913772583,
-0.41697853803634644... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Wanfq/glaive_code_assistant | Wanfq | 2023-11-26T04:11:01Z | 0 | 1 | null | [
"region:us"
] | 2023-11-26T04:11:01Z | 2023-11-26T03:12:52.000Z | 2023-11-26T03:12:52 | https://huggingface.co/datasets/glaiveai/glaive-code-assistant-v2
features: coding, single-turn, task
length: 215k | [
-0.17956219613552094,
-0.25098156929016113,
0.20841234922409058,
0.5942820906639099,
-0.3768722712993622,
-0.26604828238487244,
0.17439502477645874,
-0.7017853856086731,
0.6123172640800476,
0.8074896335601807,
-0.8669028878211975,
-0.4434605836868286,
-0.6224880814552307,
-0.45471829175949... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Wanfq/lesswrong_amplify_instruct | Wanfq | 2023-11-26T04:15:47Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T04:15:47Z | 2023-11-26T03:16:26.000Z | 2023-11-26T03:16:26 | https://huggingface.co/datasets/LDJnr/LessWrong-Amplify-Instruct
features: general, multi-turn, chat
length: 0.663k | [
-0.8236497044563293,
-0.5906859636306763,
-0.04995492845773697,
0.47336167097091675,
-0.2942351996898651,
-0.32613128423690796,
-0.30556991696357727,
-0.9037326574325562,
1.0951021909713745,
0.5463232398033142,
-1.003371238708496,
-0.49063098430633545,
-0.5242478847503662,
-0.2820939719676... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Wanfq/mathinstruct | Wanfq | 2023-11-26T04:17:27Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T04:17:27Z | 2023-11-26T03:17:25.000Z | 2023-11-26T03:17:25 | https://huggingface.co/datasets/TIGER-Lab/MathInstruct
features: mathematics, single-turn, task
preserve keys: 'data/CoT/math50k_camel.json', 'data/CoT/college_math.json', 'data/CoT/TheoremQA.json', 'data/CoT/number_comparison.json', 'data/CoT/aqua_rat.json'
length: 142k | [
-0.6498897671699524,
-0.24294815957546234,
0.10164395719766617,
0.1361384242773056,
-0.08205065876245499,
0.422930508852005,
0.06995628774166107,
0.1328771710395813,
0.4897308349609375,
0.8220650553703308,
-0.8752791285514832,
-0.6512474417686462,
-0.34019845724105835,
0.26539936661720276,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Wanfq/metamathqa | Wanfq | 2023-11-26T04:23:54Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T04:23:54Z | 2023-11-26T03:20:26.000Z | 2023-11-26T03:20:26 | https://huggingface.co/datasets/meta-math/MetaMathQA
features: mathematics, single-turn, task
length: 395k | [
-0.7193005084991455,
-0.24171651899814606,
0.26285460591316223,
0.536820113658905,
-0.30938541889190674,
-0.20122593641281128,
0.0014329850673675537,
-0.11812952160835266,
0.8190850615501404,
0.8647712469100952,
-1.3740217685699463,
-0.7115592956542969,
-0.476959228515625,
-0.1599376499652... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Wanfq/oasst_top1 | Wanfq | 2023-11-26T04:24:59Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T04:24:59Z | 2023-11-26T03:23:22.000Z | 2023-11-26T03:23:22 | https://huggingface.co/datasets/OpenAssistant/oasst_top1_2023-08-25
features: general, multi-turn, chat
length: 12.9k | [
-0.576758086681366,
-0.5796973705291748,
0.15765142440795898,
0.25583013892173767,
-0.3646334111690521,
-0.1360776275396347,
-0.060320883989334106,
-0.6209611296653748,
0.8728370070457458,
0.4874255955219269,
-1.0948004722595215,
-0.6078594326972961,
-0.7037975788116455,
-0.597859561443328... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Wanfq/orca_best | Wanfq | 2023-11-26T04:25:38Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T04:25:38Z | 2023-11-26T03:38:02.000Z | 2023-11-26T03:38:02 | https://huggingface.co/datasets/shahules786/orca-best
features: general, single-turn, task
length: 329k | [
-0.4829564392566681,
-0.26599007844924927,
0.08516902476549149,
0.3623642921447754,
-0.5350289344787598,
-0.38550862669944763,
-0.09666569530963898,
-0.7253906726837158,
0.8134092092514038,
0.6393813490867615,
-0.8560808897018433,
-0.8305949568748474,
-0.4558970332145691,
-0.05654247477650... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Wanfq/pure_dove | Wanfq | 2023-11-26T04:26:14Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T04:26:14Z | 2023-11-26T03:47:44.000Z | 2023-11-26T03:47:44 | https://huggingface.co/datasets/LDJnr/Pure-Dove
features: general, multi-turn, chat
length: 3.86k | [
-0.46476662158966064,
-0.373748779296875,
0.18289725482463837,
0.5030084848403931,
-0.4766019284725189,
-0.3288598656654358,
-0.08034512400627136,
-0.8079390525817871,
0.7239599823951721,
0.8289696574211121,
-0.9503947496414185,
-0.5399627685546875,
-0.7109346985816956,
-0.3900435864925384... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Wanfq/python_code | Wanfq | 2023-11-26T04:26:45Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T04:26:45Z | 2023-11-26T03:49:43.000Z | 2023-11-26T03:49:43 | https://huggingface.co/datasets/ajibawa-2023/Python-Code-23k-ShareGPT
features: coding, single-turn, task
length: 22.6k | [
-0.5965889096260071,
-0.3572882115840912,
-0.008333017118275166,
1.0435960292816162,
-0.21169476211071014,
-0.2946277856826782,
-0.25289806723594666,
-0.5703467130661011,
0.499563992023468,
0.5515373349189758,
-0.8863022327423096,
-0.4452015161514282,
-0.7399843335151672,
-0.12285318970680... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Wanfq/sharegpt_gpt4 | Wanfq | 2023-11-26T04:27:21Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T04:27:21Z | 2023-11-26T03:50:33.000Z | 2023-11-26T03:50:33 | https://huggingface.co/datasets/shibing624/sharegpt_gpt4
features: general, multi-turn, chat
length: 6.21k | [
-0.7661207914352417,
-0.474355012178421,
0.28298383951187134,
0.6057591438293457,
-0.4543899893760681,
-0.17819058895111084,
-0.1668243259191513,
-0.6604242920875549,
0.7093040943145752,
0.42864444851875305,
-0.8396412134170532,
-0.5149651169776917,
-0.817318320274353,
-0.4227231740951538,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Wanfq/ultrachat_200k | Wanfq | 2023-11-26T04:27:53Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T04:27:53Z | 2023-11-26T03:51:58.000Z | 2023-11-26T03:51:58 | https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k
features: general, multi-turn, chat
length: 208k | [
-0.6940896511077881,
-0.3237180709838867,
0.22665011882781982,
0.5516184568405151,
-0.32344892621040344,
0.023510919883847237,
-0.09286694973707199,
-0.5249915719032288,
0.8445971608161926,
0.6767345070838928,
-0.95902419090271,
-0.4636740982532501,
-0.21947124600410461,
-0.350118726491928... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Wanfq/verified_camel | Wanfq | 2023-11-26T04:28:24Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T04:28:24Z | 2023-11-26T03:58:23.000Z | 2023-11-26T03:58:23 | https://huggingface.co/datasets/LDJnr/Verified-Camel
features: general, single-turn, task
length: 0.127k | [
-0.41538405418395996,
-0.41136378049850464,
-0.14571166038513184,
0.47297564148902893,
-0.6512787938117981,
-0.44320765137672424,
-0.03299666568636894,
-0.7944552302360535,
0.8160797953605652,
0.884371280670166,
-1.1167582273483276,
-0.8504940867424011,
-0.5326624512672424,
-0.178435787558... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Wanfq/wizardcoder | Wanfq | 2023-11-26T04:28:50Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T04:28:50Z | 2023-11-26T03:58:44.000Z | 2023-11-26T03:58:44 | https://huggingface.co/datasets/theblackcat102/evol-codealpaca-v1
features: coding, single-turn, task
length: 111k | [
-0.5028278231620789,
-0.3427988588809967,
0.1671612411737442,
0.6213457584381104,
-0.4232291281223297,
-0.034163251519203186,
0.09890071302652359,
-0.6052746176719666,
1.072594404220581,
0.783395528793335,
-0.9555721879005432,
-0.6463679075241089,
-0.49396783113479614,
-0.03785379976034164... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
Wanfq/wizardlm | Wanfq | 2023-11-26T04:29:27Z | 0 | 1 | null | [
"region:us"
] | 2023-11-26T04:29:27Z | 2023-11-26T04:01:23.000Z | 2023-11-26T04:01:23 | https://huggingface.co/datasets/ehartford/WizardLM_evol_instruct_V2_196k_unfiltered_merged_split
features: general, multi-turn, chat
length: 154k | [
-0.7516478300094604,
-0.4229121804237366,
-0.10108449310064316,
0.18056897819042206,
-0.45732998847961426,
0.005943366792052984,
-0.11202516406774521,
-0.7106345891952515,
0.6016382575035095,
0.7735662460327148,
-0.8981188535690308,
-0.22461238503456116,
-0.44766420125961304,
-0.1980338394... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
yuyijiong/Multi-doc-QA-CommonCrawl | yuyijiong | 2023-11-26T04:59:58Z | 0 | 0 | null | [
"size_categories:10K<n<100K",
"language:en",
"license:cc-by-nc-4.0",
"region:us"
] | 2023-11-26T04:59:58Z | 2023-11-26T04:04:20.000Z | 2023-11-26T04:04:20 | ---
license: cc-by-nc-4.0
size_categories:
- 10K<n<100K
language:
- en
---
* English multi-document Q&A data created using RedPajamaCommonCrawl data as reference text
* Each sample contains <font color=red>one reference document, 199 irrelevant documents, and a Q&A pair based on the reference document</font>. It can be used to train models to extract the target information from a large number of documents.
* dataset size: about 10k | [
-0.4923892915248871,
-0.8856362700462341,
0.11995550990104675,
0.4263015389442444,
-0.40535932779312134,
-0.4111218750476837,
0.11669070273637772,
-0.5006135106086731,
0.6839128732681274,
0.7419762015342712,
-0.9592090249061584,
-0.5294612646102905,
-0.6588477492332458,
0.3171824514865875,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
praneetha0812/nlp_and_non_nlp_terms | praneetha0812 | 2023-11-26T04:18:52Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T04:18:52Z | 2023-11-26T04:18:52.000Z | 2023-11-26T04:18:52 | Entry not found | [
-0.3227648138999939,
-0.22568459808826447,
0.8622260093688965,
0.43461498618125916,
-0.5282989144325256,
0.701296329498291,
0.7915719151496887,
0.07618649303913116,
0.7746025323867798,
0.2563220262527466,
-0.7852813601493835,
-0.22573833167552948,
-0.9104480743408203,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
pykeio/librivox-tracks | pykeio | 2023-11-26T04:25:32Z | 0 | 0 | null | [
"task_categories:text-to-speech",
"task_categories:automatic-speech-recognition",
"size_categories:100K<n<1M",
"license:cc-by-4.0",
"region:us"
] | 2023-11-26T04:25:32Z | 2023-11-26T04:21:02.000Z | 2023-11-26T04:21:02 | ---
license: cc-by-4.0
task_categories:
- text-to-speech
- automatic-speech-recognition
pretty_name: LibriVox Tracks
size_categories:
- 100K<n<1M
---
A dataset of all audio files uploaded to LibriVox before 26th September 2023. | [
-0.44700586795806885,
-0.043626319617033005,
0.5573174357414246,
0.3173424303531647,
0.00018169765826314688,
-0.5106782913208008,
0.2781316041946411,
-0.3183607757091522,
0.6528980731964111,
1.262939453125,
-1.159000039100647,
-0.3851369619369507,
-0.25629934668540955,
-0.21536259353160858... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
NInjaQuarrior/DiseasterData | NInjaQuarrior | 2023-11-26T04:37:46Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T04:37:46Z | 2023-11-26T04:36:18.000Z | 2023-11-26T04:36:18 | Entry not found | [
-0.3227648138999939,
-0.22568459808826447,
0.8622260093688965,
0.43461498618125916,
-0.5282989144325256,
0.701296329498291,
0.7915719151496887,
0.07618649303913116,
0.7746025323867798,
0.2563220262527466,
-0.7852813601493835,
-0.22573833167552948,
-0.9104480743408203,
0.5715669393539429,
... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
dread1900/DMZ-V2 | dread1900 | 2023-11-26T04:49:19Z | 0 | 0 | null | [
"license:unknown",
"region:us"
] | 2023-11-26T04:49:19Z | 2023-11-26T04:44:58.000Z | 2023-11-26T04:44:58 | ---
license: unknown
---
| [
-0.12853379547595978,
-0.18616773188114166,
0.6529127955436707,
0.4943625330924988,
-0.19319316744804382,
0.23607458174228668,
0.36071985960006714,
0.05056329071521759,
0.5793651938438416,
0.740013837814331,
-0.6508100628852844,
-0.23783975839614868,
-0.710224986076355,
-0.0478257611393928... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
2ch/textures | 2ch | 2023-11-26T06:05:28Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T06:05:28Z | 2023-11-26T05:05:12.000Z | 2023-11-26T05:05:12 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
yvonne1123/training_dynamic | yvonne1123 | 2023-11-26T08:54:27Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T08:54:27Z | 2023-11-26T07:12:43.000Z | 2023-11-26T07:12:43 | # Dataset Dir
```
data(input path)
│   index.json (optional, for noisy model)
│   new_labels.json (optional, for noisy model)
│   old_labels.json (optional, for noisy model)
|
└─── sprites
│ │ 0.png
│ │ 1.png
│ │ ...
│
└───Model
│ │ model.py
│ │
│ └───Epoch_1
│ │ index.json
│ │ subject_model.pth
|   |   (train_data.npy) [after preprocessing]
|   |   (test_data.npy) [after preprocessing]
|   |   (border_centers.npy) [after preprocessing]
|   |   (vismodel.pth) [after training]
|   |   (embedding.npy) [after visualization]
|   |   (scale.npy) [after visualization]
|   |   (bgimg.png) [after visualization]
│ └───Epoch_2
| | ...
│
└───Training_data
| │ training_dataset_data.pth
| │ training_dataset_label.pth
│
└───Testing_data
│ │ testing_dataset_data.pth
│ │ testing_dataset_label.pth
└───config.json
```
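As a quick orientation, here is a minimal sketch of reading a few of the files above with PyTorch and NumPy. The paths are taken verbatim from the tree; what each tensor actually contains is an assumption, since the layout does not document it:

```python
import numpy as np
import torch

root = "data"  # the input path at the top of the layout

# Raw training split (paths taken directly from the tree above).
train_x = torch.load(f"{root}/Training_data/training_dataset_data.pth")
train_y = torch.load(f"{root}/Training_data/training_dataset_label.pth")

# Per-epoch artifacts; each file only exists after the bracketed stage has run.
epoch_dir = f"{root}/Model/Epoch_1"
train_repr = np.load(f"{epoch_dir}/train_data.npy")  # [after preprocessing]
embedding = np.load(f"{epoch_dir}/embedding.npy")    # [after visualization]

print(train_repr.shape, embedding.shape)
```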
| [
-0.5621963143348694,
-0.6245537996292114,
0.3348081707954407,
-0.0014186599291861057,
-0.3078603148460388,
-0.10604508966207504,
0.015796514227986336,
0.09123747795820236,
0.2939671576023102,
0.6643550395965576,
-0.6668426990509033,
-0.6863853931427002,
-0.6194592714309692,
0.0286633484065... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
nintorac/midi_etl | nintorac | 2023-11-26T07:20:36Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T07:20:36Z | 2023-11-26T07:16:48.000Z | 2023-11-26T07:16:48 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
wrandhawa/test | wrandhawa | 2023-11-26T07:29:19Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T07:29:19Z | 2023-11-26T07:29:16.000Z | 2023-11-26T07:29:16 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622264862060547,
0.43461528420448303,
-0.52829909324646,
0.7012971639633179,
0.7915720343589783,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104477167129517,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
venketh/SlimPajama-62B | venketh | 2023-11-28T16:37:49Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | 2023-11-28T16:37:49Z | 2023-11-26T07:39:25.000Z | 2023-11-26T07:39:25 | ---
license: apache-2.0
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
xinyu1205/recognize-anything-dataset | xinyu1205 | 2023-11-28T09:03:12Z | 0 | 0 | null | [
"task_categories:zero-shot-classification",
"size_categories:10M<n<100M",
"language:en",
"license:apache-2.0",
"image recognition",
"region:us"
] | 2023-11-28T09:03:12Z | 2023-11-26T07:39:53.000Z | 2023-11-26T07:39:53 | ---
license: apache-2.0
language:
- en
tags:
- image recognition
task_categories:
- zero-shot-classification
size_categories:
- 10M<n<100M
---
# Recognize Anything Dataset Card
## Dataset details
**Dataset type:**
These annotation files come from the Recognize Anything Model (RAM). RAM proposes an automatic data engine that generates substantial image tags from image-text pairs.
**Dataset date:**
The Recognize Anything Dataset was collected in April 2023 by the automatic data engine proposed by RAM.
**Paper or resources for more information:**
https://github.com/xinyu1205/recognize-anything
**Where to send questions or comments about the model:**
https://github.com/xinyu1205/recognize-anything/issues
## Intended use
**Primary intended uses:**
The primary use of Recognize Anything is research on fundamental image recognition models.
**Primary intended users:**
The primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence.
| [
-0.5655826330184937,
-0.3657921552658081,
0.34027615189552307,
-0.080605648458004,
-0.3633290231227875,
-0.46920594573020935,
0.054302990436553955,
-0.8728167414665222,
0.3446009159088135,
0.35475921630859375,
-0.3336917459964752,
-0.7325538992881775,
-0.591494083404541,
-0.069247022271156... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
miittnnss/testdataset | miittnnss | 2023-11-26T08:39:42Z | 0 | 0 | null | [
"pytorch",
"region:us"
] | 2023-11-26T08:39:42Z | 2023-11-26T08:12:20.000Z | 2023-11-26T08:12:20 | ---
pretty_name: TestDataset
tags:
- pytorch
--- | [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
mawadalla/scientific-figures-captions-context | mawadalla | 2023-11-27T21:47:52Z | 0 | 1 | null | [
"task_categories:visual-question-answering",
"task_categories:document-question-answering",
"size_categories:100K<n<1M",
"language:en",
"region:us"
] | 2023-11-27T21:47:52Z | 2023-11-26T08:50:40.000Z | 2023-11-26T08:50:40 | ---
pretty_name: Scientific Figures, Captions and Context
task_categories:
- visual-question-answering
- document-question-answering
language:
- en
size_categories:
- 100K<n<1M
configs:
- config_name: Data
data_files: merged.json
---
# Dataset Card for Scientific Figures, Captions, and Context
A novel vision-language dataset of scientific figures taken directly from research papers.
We scraped approximately 150k papers, containing about 690k figures in total. We extracted each figure's caption and label from the paper. In addition, we searched through each paper to find references to each figure and included the surrounding text as 'context' for that figure.
All figures were taken from arXiv research papers.
<figure>
<img width="500" src="example1.png">
<figcaption>Figure 5: Comparisons between our multifidelity learning paradigm and single low-fidelity (all GPT-3.5) annotation on four domain-specific tasks given the same total 1000 annotation budget. Note that the samples for all GPT-3.5 are drawn based on the uncertainty score.</figcaption>
</figure>
<figure>
<img width="500" src="example2.png">
<figcaption>Figure 3: Problem representation visualization by T-SNE. Our model with A&D improves the problem representation learning, which groups analogical problems close and separates non-analogical problems.</figcaption>
</figure>
### Usage
The `merged.json` file is a mapping between the figure's filename as stored in the repository and its caption, label, and context.
To use the dataset, extract the contents located under dataset/figures/ and keep the raw images in the same directory so that they match the `image_filename` fields.
The images are named in the format ```<paper id>-<figure name>``` where paper id is the id given by arXiv and figure name is the name of the figure as given in the raw format of each paper.
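For quick inspection, here is a minimal sketch of reading `merged.json` locally. It assumes the file has been downloaded to the working directory and makes no assumptions about fields beyond those documented in the example object below:

```python
import json

# merged.json maps each figure to its label, caption, and context
# (see the example object later in this card).
with open("merged.json", "r", encoding="utf-8") as f:
    data = json.load(f)

# Inspect the overall shape of the file before relying on a schema.
print(type(data), len(data))
```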
# Contributors
Yousef Gomaa (@yousefg-codes) and Mohamed Awadalla (@mawadalla)
## Dataset Description
- **Paper:** coming soon
### Dataset Summary
This dataset includes ~690,000 figures from ~150,000 scientific papers taken from arXiv. Each object in the JSON file is a single research paper with a list of figures, each with its caption and surrounding context.
| Category | Count |
|:-----------|--------:|
| Figure | 690883 |
| Paper | 152504 |
### Data Instances
An example of an object in the `merged.json` file (image files may be .png, .eps, .pdf, or another type):
```json
{
    [
        {
            "image_filename": "dataset/figures/example.png",
            "label": "fig_example",
            "caption": "an example caption for this figure",
            "context": ["example context where this figure was referenced", "up to 600 characters"]
        },
...
]
}
```
## Dataset Creation
We utilized the bulk access of arXiv's papers.
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Citation Information
coming soon | [
-0.4253195524215698,
-0.32268229126930237,
0.3268579840660095,
-0.11700178682804108,
-0.3616808354854584,
0.03996768221259117,
0.01254544872790575,
-0.5137414336204529,
0.23076282441616058,
0.6043570041656494,
-0.3411506414413452,
-0.6847469210624695,
-0.6932668089866638,
0.343296498060226... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
open-llm-leaderboard/details_01-ai__Yi-34B-Chat_public | open-llm-leaderboard | 2023-11-26T08:59:24Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T08:59:24Z | 2023-11-26T08:58:33.000Z | 2023-11-26T08:58:33 | ---
pretty_name: Evaluation run of 01-ai/Yi-34B-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [01-ai/Yi-34B-Chat](https://huggingface.co/01-ai/Yi-34B-Chat) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_01-ai__Yi-34B-Chat_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-26T08:55:32.839765](https://huggingface.co/datasets/open-llm-leaderboard/details_01-ai__Yi-34B-Chat_public/blob/main/results_2023-11-26T08-55-32.839765.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7370378132198422,\n\
\ \"acc_stderr\": 0.028721593699452555,\n \"acc_norm\": 0.7485754915381607,\n\
\ \"acc_norm_stderr\": 0.029297069476795286,\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.5541159010785334,\n\
\ \"mc2_stderr\": 0.015532900561162527,\n \"em\": 0.005138422818791947,\n\
\ \"em_stderr\": 0.0007322104102794241,\n \"f1\": 0.08032508389261797,\n\
\ \"f1_stderr\": 0.001571649833831937\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882417,\n\
\ \"acc_norm\": 0.6510238907849829,\n \"acc_norm_stderr\": 0.013928933461382501\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6536546504680343,\n\
\ \"acc_stderr\": 0.004748324319714273,\n \"acc_norm\": 0.8407687711611233,\n\
\ \"acc_norm_stderr\": 0.003651437958333959\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.7185185185185186,\n \"acc_stderr\": 0.038850042458002526,\n\
\ \"acc_norm\": 0.7185185185185186,\n \"acc_norm_stderr\": 0.038850042458002526\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8618421052631579,\n\
\ \"acc_stderr\": 0.028081042939576552,\n \"acc_norm\": 0.8618421052631579,\n\
\ \"acc_norm_stderr\": 0.028081042939576552\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.02461829819586651,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.02461829819586651\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.8472222222222222,\n \"acc_stderr\": 0.030085743248565666,\n\
\ \"acc_norm\": 0.8472222222222222,\n \"acc_norm_stderr\": 0.030085743248565666\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.035149425512674394,\n\
\ \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.035149425512674394\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n\
\ \"acc_stderr\": 0.04951218252396264,\n \"acc_norm\": 0.45098039215686275,\n\
\ \"acc_norm_stderr\": 0.04951218252396264\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7659574468085106,\n\
\ \"acc_stderr\": 0.02767845257821239,\n \"acc_norm\": 0.7659574468085106,\n\
\ \"acc_norm_stderr\": 0.02767845257821239\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.543859649122807,\n \"acc_stderr\": 0.046854730419077895,\n\
\ \"acc_norm\": 0.543859649122807,\n \"acc_norm_stderr\": 0.046854730419077895\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.7931034482758621,\n \"acc_stderr\": 0.03375672449560553,\n \"\
acc_norm\": 0.7931034482758621,\n \"acc_norm_stderr\": 0.03375672449560553\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6455026455026455,\n \"acc_stderr\": 0.024636830602841997,\n \"\
acc_norm\": 0.6455026455026455,\n \"acc_norm_stderr\": 0.024636830602841997\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.867741935483871,\n \"acc_stderr\": 0.019272015434846457,\n \"\
acc_norm\": 0.867741935483871,\n \"acc_norm_stderr\": 0.019272015434846457\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6157635467980296,\n \"acc_stderr\": 0.0342239856565755,\n \"acc_norm\"\
: 0.6157635467980296,\n \"acc_norm_stderr\": 0.0342239856565755\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.898989898989899,\n \"acc_stderr\": 0.021469735576055343,\n \"\
acc_norm\": 0.898989898989899,\n \"acc_norm_stderr\": 0.021469735576055343\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.014385432857476442,\n\
\ \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.014385432857476442\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7846153846153846,\n \"acc_stderr\": 0.020843034557462878,\n\
\ \"acc_norm\": 0.7846153846153846,\n \"acc_norm_stderr\": 0.020843034557462878\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083015,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083015\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8361344537815126,\n \"acc_stderr\": 0.024044054940440488,\n\
\ \"acc_norm\": 0.8361344537815126,\n \"acc_norm_stderr\": 0.024044054940440488\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5033112582781457,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.5033112582781457,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9100917431192661,\n \"acc_stderr\": 0.012264304540230444,\n \"\
acc_norm\": 0.9100917431192661,\n \"acc_norm_stderr\": 0.012264304540230444\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6388888888888888,\n \"acc_stderr\": 0.032757734861009996,\n \"\
acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.032757734861009996\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969426998,\n \"\
acc_norm\": 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969426998\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065508,\n \
\ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065508\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n\
\ \"acc_stderr\": 0.02624113299640726,\n \"acc_norm\": 0.8116591928251121,\n\
\ \"acc_norm_stderr\": 0.02624113299640726\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8931297709923665,\n \"acc_stderr\": 0.027096548624883733,\n\
\ \"acc_norm\": 0.8931297709923665,\n \"acc_norm_stderr\": 0.027096548624883733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540627,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540627\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n\
\ \"acc_stderr\": 0.02923927267563274,\n \"acc_norm\": 0.8981481481481481,\n\
\ \"acc_norm_stderr\": 0.02923927267563274\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.852760736196319,\n \"acc_stderr\": 0.027839915278339657,\n\
\ \"acc_norm\": 0.852760736196319,\n \"acc_norm_stderr\": 0.027839915278339657\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\
\ \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n\
\ \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.03176683948640405,\n\
\ \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.03176683948640405\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.01872430174194166,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.01872430174194166\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9016602809706258,\n\
\ \"acc_stderr\": 0.010648356301876336,\n \"acc_norm\": 0.9016602809706258,\n\
\ \"acc_norm_stderr\": 0.010648356301876336\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8034682080924855,\n \"acc_stderr\": 0.02139396140436385,\n\
\ \"acc_norm\": 0.8034682080924855,\n \"acc_norm_stderr\": 0.02139396140436385\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7072625698324022,\n\
\ \"acc_stderr\": 0.015218109544410175,\n \"acc_norm\": 0.7072625698324022,\n\
\ \"acc_norm_stderr\": 0.015218109544410175\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8300653594771242,\n \"acc_stderr\": 0.02150538312123137,\n\
\ \"acc_norm\": 0.8300653594771242,\n \"acc_norm_stderr\": 0.02150538312123137\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.8006430868167203,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8734567901234568,\n \"acc_stderr\": 0.01849860055879091,\n\
\ \"acc_norm\": 0.8734567901234568,\n \"acc_norm_stderr\": 0.01849860055879091\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6170212765957447,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5508474576271186,\n\
\ \"acc_stderr\": 0.012704030518851476,\n \"acc_norm\": 0.5508474576271186,\n\
\ \"acc_norm_stderr\": 0.012704030518851476\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7867647058823529,\n \"acc_stderr\": 0.024880971512294257,\n\
\ \"acc_norm\": 0.7867647058823529,\n \"acc_norm_stderr\": 0.024880971512294257\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8284313725490197,\n \"acc_stderr\": 0.01525199316349162,\n \
\ \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.01525199316349162\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n\
\ \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.390452876376989,\n\
\ \"mc1_stderr\": 0.017078230743431448,\n \"mc2\": 0.5541159010785334,\n\
\ \"mc2_stderr\": 0.015532900561162527\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.011285013754047443\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.005138422818791947,\n \
\ \"em_stderr\": 0.0007322104102794241,\n \"f1\": 0.08032508389261797,\n\
\ \"f1_stderr\": 0.001571649833831937\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.19787717968157695,\n \"acc_stderr\": 0.010973889601756317\n\
\ }\n}\n```"
repo_url: https://huggingface.co/01-ai/Yi-34B-Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|arc:challenge|25_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|drop|3_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|gsm8k|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hellaswag|10_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-26T08-55-32.839765.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-26T08-55-32.839765.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- '**/details_harness|winogrande|5_2023-11-26T08-55-32.839765.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-26T08-55-32.839765.parquet'
- config_name: results
data_files:
- split: 2023_11_26T08_55_32.839765
path:
- results_2023-11-26T08-55-32.839765.parquet
- split: latest
path:
- results_2023-11-26T08-55-32.839765.parquet
---
# Dataset Card for Evaluation run of 01-ai/Yi-34B-Chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/01-ai/Yi-34B-Chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [01-ai/Yi-34B-Chat](https://huggingface.co/01-ai/Yi-34B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_01-ai__Yi-34B-Chat_public",
                    "harness_winogrande_5",
                    split="latest")
```
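The aggregated metrics for the run live in the `results` configuration. Here is a minimal sketch for reading them, assuming the same public repo as above (the row layout mirrors the JSON shown under "Latest results" below):
```python
from datasets import load_dataset

# Hedged sketch: the "results" configuration holds one row of aggregated
# metrics per run; the "latest" split always points at the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_01-ai__Yi-34B-Chat_public",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics of the latest run
```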
## Latest results
These are the [latest results from run 2023-11-26T08:55:32.839765](https://huggingface.co/datasets/open-llm-leaderboard/details_01-ai__Yi-34B-Chat_public/blob/main/results_2023-11-26T08-55-32.839765.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.7370378132198422,
"acc_stderr": 0.028721593699452555,
"acc_norm": 0.7485754915381607,
"acc_norm_stderr": 0.029297069476795286,
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431448,
"mc2": 0.5541159010785334,
"mc2_stderr": 0.015532900561162527,
"em": 0.005138422818791947,
"em_stderr": 0.0007322104102794241,
"f1": 0.08032508389261797,
"f1_stderr": 0.001571649833831937
},
"harness|arc:challenge|25": {
"acc": 0.6356655290102389,
"acc_stderr": 0.014063260279882417,
"acc_norm": 0.6510238907849829,
"acc_norm_stderr": 0.013928933461382501
},
"harness|hellaswag|10": {
"acc": 0.6536546504680343,
"acc_stderr": 0.004748324319714273,
"acc_norm": 0.8407687711611233,
"acc_norm_stderr": 0.003651437958333959
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8618421052631579,
"acc_stderr": 0.028081042939576552,
"acc_norm": 0.8618421052631579,
"acc_norm_stderr": 0.028081042939576552
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8,
"acc_stderr": 0.02461829819586651,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02461829819586651
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565666,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396264,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396264
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7659574468085106,
"acc_stderr": 0.02767845257821239,
"acc_norm": 0.7659574468085106,
"acc_norm_stderr": 0.02767845257821239
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.543859649122807,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.543859649122807,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7931034482758621,
"acc_stderr": 0.03375672449560553,
"acc_norm": 0.7931034482758621,
"acc_norm_stderr": 0.03375672449560553
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6455026455026455,
"acc_stderr": 0.024636830602841997,
"acc_norm": 0.6455026455026455,
"acc_norm_stderr": 0.024636830602841997
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.867741935483871,
"acc_stderr": 0.019272015434846457,
"acc_norm": 0.867741935483871,
"acc_norm_stderr": 0.019272015434846457
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6157635467980296,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.6157635467980296,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.898989898989899,
"acc_stderr": 0.021469735576055343,
"acc_norm": 0.898989898989899,
"acc_norm_stderr": 0.021469735576055343
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.014385432857476442,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.014385432857476442
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7846153846153846,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.7846153846153846,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083015,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083015
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8361344537815126,
"acc_stderr": 0.024044054940440488,
"acc_norm": 0.8361344537815126,
"acc_norm_stderr": 0.024044054940440488
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5033112582781457,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.5033112582781457,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9100917431192661,
"acc_stderr": 0.012264304540230444,
"acc_norm": 0.9100917431192661,
"acc_norm_stderr": 0.012264304540230444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.032757734861009996,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.032757734861009996
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969426998,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969426998
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065508,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065508
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.02624113299640726,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.02624113299640726
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8931297709923665,
"acc_stderr": 0.027096548624883733,
"acc_norm": 0.8931297709923665,
"acc_norm_stderr": 0.027096548624883733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540627,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540627
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563274,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563274
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.852760736196319,
"acc_stderr": 0.027839915278339657,
"acc_norm": 0.852760736196319,
"acc_norm_stderr": 0.027839915278339657
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.03176683948640405,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.03176683948640405
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.01872430174194166,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.01872430174194166
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9016602809706258,
"acc_stderr": 0.010648356301876336,
"acc_norm": 0.9016602809706258,
"acc_norm_stderr": 0.010648356301876336
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8034682080924855,
"acc_stderr": 0.02139396140436385,
"acc_norm": 0.8034682080924855,
"acc_norm_stderr": 0.02139396140436385
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7072625698324022,
"acc_stderr": 0.015218109544410175,
"acc_norm": 0.7072625698324022,
"acc_norm_stderr": 0.015218109544410175
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.02150538312123137,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.02150538312123137
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8006430868167203,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.8006430868167203,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8734567901234568,
"acc_stderr": 0.01849860055879091,
"acc_norm": 0.8734567901234568,
"acc_norm_stderr": 0.01849860055879091
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5508474576271186,
"acc_stderr": 0.012704030518851476,
"acc_norm": 0.5508474576271186,
"acc_norm_stderr": 0.012704030518851476
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7867647058823529,
"acc_stderr": 0.024880971512294257,
"acc_norm": 0.7867647058823529,
"acc_norm_stderr": 0.024880971512294257
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.01525199316349162,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.01525199316349162
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.390452876376989,
"mc1_stderr": 0.017078230743431448,
"mc2": 0.5541159010785334,
"mc2_stderr": 0.015532900561162527
},
"harness|winogrande|5": {
"acc": 0.797947908445146,
"acc_stderr": 0.011285013754047443
},
"harness|drop|3": {
"em": 0.005138422818791947,
"em_stderr": 0.0007322104102794241,
"f1": 0.08032508389261797,
"f1_stderr": 0.001571649833831937
},
"harness|gsm8k|5": {
"acc": 0.19787717968157695,
"acc_stderr": 0.010973889601756317
}
}
```
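For a quick per-category summary of these numbers, one can parse the results file directly. A minimal sketch, where the file name comes from the link above and the top-level JSON layout is an assumption:
```python
import json
from urllib.request import urlopen

# Hedged sketch: fetch the raw results JSON and average accuracy over all
# MMLU (hendrycksTest) subtasks.
URL = ("https://huggingface.co/datasets/open-llm-leaderboard/"
       "details_01-ai__Yi-34B-Chat_public/raw/main/"
       "results_2023-11-26T08-55-32.839765.json")

with urlopen(URL) as f:
    data = json.load(f)
scores = data.get("results", data)  # layout assumption: metrics may be nested

mmlu = [v["acc"] for k, v in scores.items() if "hendrycksTest" in k]
print(f"MMLU average over {len(mmlu)} subtasks: {sum(mmlu) / len(mmlu):.4f}")
```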
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] | [
-0.7253909111022949,
-0.892030656337738,
0.2625539302825928,
0.21223804354667664,
-0.15855704247951508,
-0.04450330510735512,
-0.0035154693759977818,
-0.22191132605075836,
0.5804341435432434,
-0.049615055322647095,
-0.508234441280365,
-0.7038512825965881,
-0.4336002767086029,
0.20278850197... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
shourya342/15rowdata | shourya342 | 2023-11-26T09:19:04Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T09:19:04Z | 2023-11-26T09:18:48.000Z | 2023-11-26T09:18:48 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
ThomasSorensen/aa | ThomasSorensen | 2023-11-26T09:30:49Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T09:30:49Z | 2023-11-26T09:19:53.000Z | 2023-11-26T09:19:53 | Entry not found | [
-0.3227645754814148,
-0.22568479180335999,
0.8622263669967651,
0.43461522459983826,
-0.52829909324646,
0.7012971639633179,
0.7915719747543335,
0.07618614286184311,
0.774603009223938,
0.2563217282295227,
-0.7852813005447388,
-0.22573819756507874,
-0.9104475975036621,
0.5715674161911011,
-... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
satware/yggdrasil | satware | 2023-11-26T09:52:04Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-26T09:52:04Z | 2023-11-26T09:52:04.000Z | 2023-11-26T09:52:04 | ---
license: mit
---
| [
-0.1285335123538971,
-0.1861683875322342,
0.6529128551483154,
0.49436232447624207,
-0.19319400191307068,
0.23607441782951355,
0.36072009801864624,
0.05056373029947281,
0.5793656706809998,
0.7400146722793579,
-0.650810182094574,
-0.23784008622169495,
-0.7102247476577759,
-0.0478255338966846... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
code-philia/mtpnet | code-philia | 2023-11-26T09:58:18Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-26T09:58:18Z | 2023-11-26T09:58:18.000Z | 2023-11-26T09:58:18 | ---
license: mit
---
| [
-0.1285339742898941,
-0.18616800010204315,
0.6529127359390259,
0.4943626821041107,
-0.1931934952735901,
0.2360742688179016,
0.360720157623291,
0.05056300014257431,
0.5793654322624207,
0.7400140166282654,
-0.6508105993270874,
-0.23783984780311584,
-0.7102248668670654,
-0.047826044261455536,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
jinghan23/DatasetofPEMCompostition | jinghan23 | 2023-11-26T10:12:50Z | 0 | 1 | null | [
"license:cc-by-nc-4.0",
"arxiv:2306.14870",
"region:us"
] | 2023-11-26T10:12:50Z | 2023-11-26T10:03:37.000Z | 2023-11-26T10:03:37 | ---
license: cc-by-nc-4.0
---
Training dataset for the Alpaca-LoRA negation experiments of [PEM composition](https://arxiv.org/abs/2306.14870).
It also includes instructions for model evaluation on helpfulness and toxicity.
A more concise explanation can be found in the [Git repo](https://github.com/hkust-nlp/PEM_composition).
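Since the repo is gated (see the warning below), loading it requires an approved access request and a personal token. A minimal, hedged sketch; the exact config/split layout is an assumption:
```python
from datasets import load_dataset

# Hedged sketch: once your access request is approved, pass a Hugging Face
# access token. Older versions of `datasets` use `use_auth_token=` instead.
ds = load_dataset(
    "jinghan23/DatasetofPEMCompostition",
    token="hf_...",  # replace with your personal access token
)
print(ds)
```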
**Misuse of toxic datasets is dangerous for the AI community, so this repo is gated and users must request approval.** | [
-0.2607758939266205,
-0.9076065421104431,
0.32849767804145813,
0.32889121770858765,
-0.37563225626945496,
-0.36967340111732483,
0.24963440001010895,
-0.5298319458961487,
0.14840167760849,
0.9467405080795288,
-0.7213025093078613,
-0.7540223598480225,
-0.6683013439178467,
-0.0465386509895324... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
nookbe/Buergerliches_Gesetzbuch_BGB | nookbe | 2023-11-26T11:44:58Z | 0 | 0 | null | [
"task_categories:text-classification",
"size_categories:1K<n<10K",
"language:de",
"license:mit",
"legal",
"region:us"
] | 2023-11-26T11:44:58Z | 2023-11-26T10:15:25.000Z | 2023-11-26T10:15:25 | ---
license: mit
task_categories:
- text-classification
language:
- de
tags:
- legal
pretty_name: BGB
size_categories:
- 1K<n<10K
---
# German BGB Law Dataset (Bürgerliches Gesetzbuch)
## Dataset Description
- **Date of Last Paragraph Update:** April 2023
- **Dataset Guarantee:** The dataset is provided "as is," with no guarantee of the correctness or completeness of the data.
### Dataset Summary
The BGB Law Dataset contains legal text from the German Civil Code (Bürgerliches Gesetzbuch - BGB). It focuses on the general principles of German civil law and is designed for tasks related to legal text analysis.
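A minimal, hedged sketch of loading and querying the data, assuming it is published as a standard Hugging Face dataset whose rows carry the `paragraph` and `text` fields shown under Data Instances below:
```python
from datasets import load_dataset

# Hedged sketch: the repo id matches this card; the "train" split and the
# field names are assumptions based on the example instance below.
bgb = load_dataset("nookbe/Buergerliches_Gesetzbuch_BGB", split="train")

# Look up a paragraph by its heading.
hits = bgb.filter(lambda row: row["paragraph"].startswith("§ 1 "))
for row in hits:
    print(row["paragraph"], "-", row["text"])
```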
## Dataset Structure
### Data Instances
A typical data point in the dataset comprises a legal paragraph and its corresponding text. For example:
```json
{
  "paragraph": "§ 1 Beginn der Rechtsfähigkeit",
  "text": "Die Rechtsfähigkeit des Menschen beginnt mit der Vollendung der Geburt."
}
``` | [
-0.18751271069049835,
-0.7529090046882629,
0.4489540457725525,
0.5216970443725586,
-0.5532823801040649,
-0.4227753281593323,
-0.13608451187610626,
-0.25689437985420227,
-0.06597208231687546,
0.7336905598640442,
-0.5004593133926392,
-1.022863745689392,
-0.4602125585079193,
0.003983428701758... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
hajar817/common_voice_13_0_fa_pseudo_labelled | hajar817 | 2023-11-26T10:19:12Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T10:19:12Z | 2023-11-26T10:19:12.000Z | 2023-11-26T10:19:12 | Entry not found | [
-0.32276487350463867,
-0.22568444907665253,
0.8622263073921204,
0.43461570143699646,
-0.5282988548278809,
0.7012969255447388,
0.7915717363357544,
0.07618642598390579,
0.7746027112007141,
0.25632190704345703,
-0.7852815389633179,
-0.22573848068714142,
-0.910447895526886,
0.5715675354003906,... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
MyRebRIc/japaa | MyRebRIc | 2023-11-26T10:23:47Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T10:23:47Z | 2023-11-26T10:23:18.000Z | 2023-11-26T10:23:18 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
wrrdhfj/BanGDream | wrrdhfj | 2023-11-26T15:11:55Z | 0 | 0 | null | [
"license:unknown",
"region:us"
] | 2023-11-26T15:11:55Z | 2023-11-26T10:41:25.000Z | 2023-11-26T10:41:25 | ---
license: unknown
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
MyRebRIc/ricardo | MyRebRIc | 2023-11-26T10:59:19Z | 0 | 0 | null | [
"region:us"
] | 2023-11-26T10:59:19Z | 2023-11-26T10:43:31.000Z | 2023-11-26T10:43:31 | Entry not found | [
-0.3227649927139282,
-0.225684255361557,
0.862226128578186,
0.43461498618125916,
-0.5282987952232361,
0.7012963891029358,
0.7915717363357544,
0.07618629932403564,
0.7746025919914246,
0.2563219666481018,
-0.7852816581726074,
-0.2257382869720459,
-0.9104480743408203,
0.5715669393539429,
-0... | null | null | null | null | null | null | null | null | null | null | null | null | null | |
washimneupane/stackCluster10 | washimneupane | 2023-11-26T10:43:37Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | 2023-11-26T10:43:37Z | 2023-11-26T10:43:37.000Z | 2023-11-26T10:43:37 | ---
license: mit
---
| [
-0.12853367626667023,
-0.18616794049739838,
0.6529126763343811,
0.4943627417087555,
-0.19319313764572144,
0.23607443273067474,
0.36071979999542236,
0.05056338757276535,
0.5793654322624207,
0.7400138974189758,
-0.6508103013038635,
-0.23783987760543823,
-0.710224986076355,
-0.047825977206230... | null | null | null | null | null | null | null | null | null | null | null | null | null |