| datasetId | card |
|---|---|
andersonbcdefg/misc_qa_pairs | ---
dataset_info:
  features:
  - name: query
    dtype: string
  - name: pos
    dtype: string
  - name: __index_level_0__
    dtype: int64
  splits:
  - name: train
    num_bytes: 855319505.3801266
    num_examples: 1457478
  download_size: 265348295
  dataset_size: 855319505.3801266
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
|
Skimm3r918/lovetogether | ---
license: creativeml-openrail-m
---
|
JONRFewf/my_golos | ---
license: mit
---
|
CyberHarem/kleine_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kleine (Fire Emblem)
This is the dataset of kleine (Fire Emblem), containing 29 images and their tags.
The core tags of this character are `blonde_hair, long_hair, blue_eyes, breasts, bangs, large_breasts`; they are pruned from the per-image tag lists in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 29 | 36.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kleine_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 29 | 23.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kleine_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 71 | 47.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kleine_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 29 | 34.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kleine_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 71 | 63.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kleine_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
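The IMG+TXT packages pair each image with a same-named `.txt` file holding its comma-separated tags. As a minimal sketch (the helper name and the set of image extensions are assumptions, not part of the packages' spec), pairs can be collected from an extracted package like this:

```python
from pathlib import Path

# Extensions assumed to cover the images in the package.
IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".webp"}

def load_pairs(dataset_dir):
    """Yield (image_path, tag_string) pairs from an extracted IMG+TXT package."""
    for img in sorted(Path(dataset_dir).iterdir()):
        if img.suffix.lower() not in IMAGE_EXTS:
            continue
        txt = img.with_suffix(".txt")  # tag file shares the image's base name
        tags = txt.read_text(encoding="utf-8").strip() if txt.exists() else ""
        yield img, tags
```

For example, after extracting `dataset-800.zip` into a directory, `list(load_pairs(that_dir))` gives image paths alongside their tag strings.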
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kleine_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
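Once loaded, items can be filtered by their tags, e.g. to isolate images sharing an outfit. A minimal sketch over plain `(item, tags)` pairs (the function name and data shape are assumptions; with waifuc items, the tags live in `item.meta['tags']`):

```python
def filter_by_tags(tagged_items, required_tags):
    """Return items whose tag collection contains every required tag.

    `tagged_items` is an iterable of (item, tags) pairs, where `tags` is any
    collection of tag strings (e.g. the keys of item.meta['tags'] from waifuc).
    """
    required = set(required_tags)
    return [item for item, tags in tagged_items if required <= set(tags)]

# hypothetical usage with plain data:
items = [
    ("img_1", ["1girl", "solo", "armor", "bow_(weapon)"]),
    ("img_2", ["1girl", "solo", "skirt"]),
]
print(filter_by_tags(items, ["armor", "bow_(weapon)"]))  # → ['img_1']
```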
## List of Clusters
Tag clustering results are listed below; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 29 |  |  |  |  |  | 1girl, solo, armor, looking_at_viewer, skirt, thighhighs, fingerless_gloves, simple_background, bow_(weapon), elbow_gloves, holding, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | armor | looking_at_viewer | skirt | thighhighs | fingerless_gloves | simple_background | bow_(weapon) | elbow_gloves | holding | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:--------|:-------------|:--------------------|:--------------------|:---------------|:---------------|:----------|:-------------------|
| 0 | 29 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_ledjo__Gabriel-8x7B-Instruct-v0.1 | ---
pretty_name: Evaluation run of ledjo/Gabriel-8x7B-Instruct-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ledjo/Gabriel-8x7B-Instruct-v0.1](https://huggingface.co/ledjo/Gabriel-8x7B-Instruct-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ledjo__Gabriel-8x7B-Instruct-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-03T21:07:09.107244](https://huggingface.co/datasets/open-llm-leaderboard/details_ledjo__Gabriel-8x7B-Instruct-v0.1/blob/main/results_2024-04-03T21-07-09.107244.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.712936882118807,\n\
\ \"acc_stderr\": 0.030244450409503233,\n \"acc_norm\": 0.7170199333921503,\n\
\ \"acc_norm_stderr\": 0.030824508870998326,\n \"mc1\": 0.48959608323133413,\n\
\ \"mc1_stderr\": 0.017499711430249264,\n \"mc2\": 0.6327928100766638,\n\
\ \"mc2_stderr\": 0.015051345843456798\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6646757679180887,\n \"acc_stderr\": 0.013796182947785562,\n\
\ \"acc_norm\": 0.7013651877133106,\n \"acc_norm_stderr\": 0.013374078615068735\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6813383788090022,\n\
\ \"acc_stderr\": 0.004650052150094396,\n \"acc_norm\": 0.8752240589524,\n\
\ \"acc_norm_stderr\": 0.003297893047728374\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562429,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562429\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.032790004063100495,\n\
\ \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.032790004063100495\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7811320754716982,\n \"acc_stderr\": 0.02544786382510861,\n\
\ \"acc_norm\": 0.7811320754716982,\n \"acc_norm_stderr\": 0.02544786382510861\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7630057803468208,\n\
\ \"acc_stderr\": 0.032424147574830975,\n \"acc_norm\": 0.7630057803468208,\n\
\ \"acc_norm_stderr\": 0.032424147574830975\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6638297872340425,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.6638297872340425,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n\
\ \"acc_stderr\": 0.04598188057816542,\n \"acc_norm\": 0.6052631578947368,\n\
\ \"acc_norm_stderr\": 0.04598188057816542\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130733,\n \"\
acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130733\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8516129032258064,\n \"acc_stderr\": 0.020222737554330378,\n \"\
acc_norm\": 0.8516129032258064,\n \"acc_norm_stderr\": 0.020222737554330378\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.625615763546798,\n \"acc_stderr\": 0.03405155380561952,\n \"acc_norm\"\
: 0.625615763546798,\n \"acc_norm_stderr\": 0.03405155380561952\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047709,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047709\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.01438543285747646,\n\
\ \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.01438543285747646\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.023234581088428494,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.023234581088428494\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n\
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8025210084033614,\n \"acc_stderr\": 0.02585916412205145,\n \
\ \"acc_norm\": 0.8025210084033614,\n \"acc_norm_stderr\": 0.02585916412205145\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944215,\n \"\
acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944215\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8770642201834863,\n \"acc_stderr\": 0.014078467983673374,\n \"\
acc_norm\": 0.8770642201834863,\n \"acc_norm_stderr\": 0.014078467983673374\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017016,\n \
\ \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017016\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7533632286995515,\n\
\ \"acc_stderr\": 0.028930413120910877,\n \"acc_norm\": 0.7533632286995515,\n\
\ \"acc_norm_stderr\": 0.028930413120910877\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462469,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934724,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934724\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n\
\ \"acc_stderr\": 0.04669510663875192,\n \"acc_norm\": 0.5892857142857143,\n\
\ \"acc_norm_stderr\": 0.04669510663875192\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.035865947385739734,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.035865947385739734\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n\
\ \"acc_stderr\": 0.017456987872436193,\n \"acc_norm\": 0.9230769230769231,\n\
\ \"acc_norm_stderr\": 0.017456987872436193\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8837803320561941,\n\
\ \"acc_stderr\": 0.011460632981922878,\n \"acc_norm\": 0.8837803320561941,\n\
\ \"acc_norm_stderr\": 0.011460632981922878\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.021855255263421795,\n\
\ \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.021855255263421795\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45251396648044695,\n\
\ \"acc_stderr\": 0.016646914804438775,\n \"acc_norm\": 0.45251396648044695,\n\
\ \"acc_norm_stderr\": 0.016646914804438775\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.826797385620915,\n \"acc_stderr\": 0.021668400256514266,\n\
\ \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.021668400256514266\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7942122186495176,\n\
\ \"acc_stderr\": 0.022961339906764244,\n \"acc_norm\": 0.7942122186495176,\n\
\ \"acc_norm_stderr\": 0.022961339906764244\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.021038517770157358,\n\
\ \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.021038517770157358\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.029634838473766002,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.029634838473766002\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.546284224250326,\n\
\ \"acc_stderr\": 0.01271540484127775,\n \"acc_norm\": 0.546284224250326,\n\
\ \"acc_norm_stderr\": 0.01271540484127775\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.024562204314142314,\n\
\ \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.024562204314142314\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7696078431372549,\n \"acc_stderr\": 0.01703522925803404,\n \
\ \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.01703522925803404\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7795918367346939,\n \"acc_stderr\": 0.026537045312145287,\n\
\ \"acc_norm\": 0.7795918367346939,\n \"acc_norm_stderr\": 0.026537045312145287\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101713,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101713\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.02464806896136615,\n\
\ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.02464806896136615\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48959608323133413,\n\
\ \"mc1_stderr\": 0.017499711430249264,\n \"mc2\": 0.6327928100766638,\n\
\ \"mc2_stderr\": 0.015051345843456798\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8097868981846882,\n \"acc_stderr\": 0.01103033579861744\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6057619408642911,\n \
\ \"acc_stderr\": 0.013460852357095656\n }\n}\n```"
repo_url: https://huggingface.co/ledjo/Gabriel-8x7B-Instruct-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
  data_files:
  - split: 2024_04_03T21_07_09.107244
    path:
    - '**/details_harness|arc:challenge|25_2024-04-03T21-07-09.107244.parquet'
  - split: latest
    path:
    - '**/details_harness|arc:challenge|25_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_gsm8k_5
  data_files:
  - split: 2024_04_03T21_07_09.107244
    path:
    - '**/details_harness|gsm8k|5_2024-04-03T21-07-09.107244.parquet'
  - split: latest
    path:
    - '**/details_harness|gsm8k|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hellaswag_10
  data_files:
  - split: 2024_04_03T21_07_09.107244
    path:
    - '**/details_harness|hellaswag|10_2024-04-03T21-07-09.107244.parquet'
  - split: latest
    path:
    - '**/details_harness|hellaswag|10_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_5
  data_files:
  - split: 2024_04_03T21_07_09.107244
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-management|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T21-07-09.107244.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-management|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2024-04-03T21-07-09.107244.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
  data_files:
  - split: 2024_04_03T21_07_09.107244
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T21-07-09.107244.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_anatomy_5
  data_files:
  - split: 2024_04_03T21_07_09.107244
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T21-07-09.107244.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_astronomy_5
  data_files:
  - split: 2024_04_03T21_07_09.107244
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T21-07-09.107244.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
  data_files:
  - split: 2024_04_03T21_07_09.107244
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T21-07-09.107244.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
  data_files:
  - split: 2024_04_03T21_07_09.107244
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T21-07-09.107244.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_college_biology_5
  data_files:
  - split: 2024_04_03T21_07_09.107244
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T21-07-09.107244.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
  data_files:
  - split: 2024_04_03T21_07_09.107244
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T21-07-09.107244.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
  data_files:
  - split: 2024_04_03T21_07_09.107244
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T21-07-09.107244.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
  data_files:
  - split: 2024_04_03T21_07_09.107244
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T21-07-09.107244.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
  data_files:
  - split: 2024_04_03T21_07_09.107244
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T21-07-09.107244.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- '**/details_harness|winogrande|5_2024-04-03T21-07-09.107244.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-03T21-07-09.107244.parquet'
- config_name: results
data_files:
- split: 2024_04_03T21_07_09.107244
path:
- results_2024-04-03T21-07-09.107244.parquet
- split: latest
path:
- results_2024-04-03T21-07-09.107244.parquet
---
# Dataset Card for Evaluation run of ledjo/Gabriel-8x7B-Instruct-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ledjo/Gabriel-8x7B-Instruct-v0.1](https://huggingface.co/ledjo/Gabriel-8x7B-Instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ledjo__Gabriel-8x7B-Instruct-v0.1",
"harness_winogrande_5",
	split="latest")
```
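Each per-task configuration name is a direct transliteration of the corresponding harness task identifier (pipes, dashes, and colons become underscores). A small helper (hypothetical, not part of the `datasets` API) illustrates the mapping:

```python
def task_to_config(task: str) -> str:
    """Map a harness task id (as it appears in the parquet file names)
    to the configuration name used in this dataset."""
    return task.replace("|", "_").replace("-", "_").replace(":", "_")

# task_to_config("harness|hendrycksTest-world_religions|5")
#   -> "harness_hendrycksTest_world_religions_5"
# task_to_config("harness|truthfulqa:mc|0")
#   -> "harness_truthfulqa_mc_0"
```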
## Latest results
These are the [latest results from run 2024-04-03T21:07:09.107244](https://huggingface.co/datasets/open-llm-leaderboard/details_ledjo__Gabriel-8x7B-Instruct-v0.1/blob/main/results_2024-04-03T21-07-09.107244.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.712936882118807,
"acc_stderr": 0.030244450409503233,
"acc_norm": 0.7170199333921503,
"acc_norm_stderr": 0.030824508870998326,
"mc1": 0.48959608323133413,
"mc1_stderr": 0.017499711430249264,
"mc2": 0.6327928100766638,
"mc2_stderr": 0.015051345843456798
},
"harness|arc:challenge|25": {
"acc": 0.6646757679180887,
"acc_stderr": 0.013796182947785562,
"acc_norm": 0.7013651877133106,
"acc_norm_stderr": 0.013374078615068735
},
"harness|hellaswag|10": {
"acc": 0.6813383788090022,
"acc_stderr": 0.004650052150094396,
"acc_norm": 0.8752240589524,
"acc_norm_stderr": 0.003297893047728374
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562429,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562429
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.032790004063100495,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.032790004063100495
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7811320754716982,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.7811320754716982,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.032424147574830975,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.032424147574830975
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6638297872340425,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.6638297872340425,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.04598188057816542,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.04598188057816542
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47883597883597884,
"acc_stderr": 0.025728230952130733,
"acc_norm": 0.47883597883597884,
"acc_norm_stderr": 0.025728230952130733
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8516129032258064,
"acc_stderr": 0.020222737554330378,
"acc_norm": 0.8516129032258064,
"acc_norm_stderr": 0.020222737554330378
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.625615763546798,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.625615763546798,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.01438543285747646,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.01438543285747646
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7,
"acc_stderr": 0.023234581088428494,
"acc_norm": 0.7,
"acc_norm_stderr": 0.023234581088428494
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8025210084033614,
"acc_stderr": 0.02585916412205145,
"acc_norm": 0.8025210084033614,
"acc_norm_stderr": 0.02585916412205145
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944215,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944215
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8770642201834863,
"acc_stderr": 0.014078467983673374,
"acc_norm": 0.8770642201834863,
"acc_norm_stderr": 0.014078467983673374
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017016,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017016
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7533632286995515,
"acc_stderr": 0.028930413120910877,
"acc_norm": 0.7533632286995515,
"acc_norm_stderr": 0.028930413120910877
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462469,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5892857142857143,
"acc_stderr": 0.04669510663875192,
"acc_norm": 0.5892857142857143,
"acc_norm_stderr": 0.04669510663875192
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.035865947385739734,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.035865947385739734
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.017456987872436193,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.017456987872436193
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8837803320561941,
"acc_stderr": 0.011460632981922878,
"acc_norm": 0.8837803320561941,
"acc_norm_stderr": 0.011460632981922878
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.021855255263421795,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.021855255263421795
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45251396648044695,
"acc_stderr": 0.016646914804438775,
"acc_norm": 0.45251396648044695,
"acc_norm_stderr": 0.016646914804438775
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.826797385620915,
"acc_stderr": 0.021668400256514266,
"acc_norm": 0.826797385620915,
"acc_norm_stderr": 0.021668400256514266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7942122186495176,
"acc_stderr": 0.022961339906764244,
"acc_norm": 0.7942122186495176,
"acc_norm_stderr": 0.022961339906764244
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.021038517770157358,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.021038517770157358
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.029634838473766002,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.029634838473766002
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.546284224250326,
"acc_stderr": 0.01271540484127775,
"acc_norm": 0.546284224250326,
"acc_norm_stderr": 0.01271540484127775
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.024562204314142314,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.024562204314142314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.01703522925803404,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.01703522925803404
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7795918367346939,
"acc_stderr": 0.026537045312145287,
"acc_norm": 0.7795918367346939,
"acc_norm_stderr": 0.026537045312145287
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101713,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101713
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.02464806896136615,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.02464806896136615
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48959608323133413,
"mc1_stderr": 0.017499711430249264,
"mc2": 0.6327928100766638,
"mc2_stderr": 0.015051345843456798
},
"harness|winogrande|5": {
"acc": 0.8097868981846882,
"acc_stderr": 0.01103033579861744
},
"harness|gsm8k|5": {
"acc": 0.6057619408642911,
"acc_stderr": 0.013460852357095656
}
}
```
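A headline number such as the MMLU macro average can be recomputed from a results payload like the one above by averaging `acc` over the `hendrycksTest` sub-tasks. A minimal sketch, assuming the JSON has already been loaded into a Python dict (the `sample` dict below is an illustrative excerpt, not the full payload):

```python
def mmlu_macro_avg(results: dict) -> float:
    """Unweighted mean of `acc` over all hendrycksTest sub-tasks."""
    accs = [v["acc"] for k, v in results.items()
            if k.startswith("harness|hendrycksTest-")]
    return sum(accs) / len(accs)

sample = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5060240963855421},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8830409356725146},
    "harness|winogrande|5": {"acc": 0.8097868981846882},
}
# Only the two hendrycksTest entries are averaged; winogrande is skipped.
print(mmlu_macro_avg(sample))
```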
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_cola_null_prepositions | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 31875
num_examples: 435
- name: test
num_bytes: 30754
num_examples: 431
- name: train
num_bytes: 255285
num_examples: 3563
download_size: 151209
dataset_size: 317914
---
# Dataset Card for "MULTI_VALUE_cola_null_prepositions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Labagaite/fr-summarizer-dataset | ---
dataset_info:
features:
- name: fr-summarizer-dataset
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 13739369
num_examples: 1968
- name: validation
num_bytes: 2957786
num_examples: 440
download_size: 7646820
dataset_size: 16697155
configs:
- config_name: string
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
license: mit
task_categories:
- summarization
- text-generation
- text2text-generation
language:
- fr
tags:
- code
- summarizer
- dataset
- llm
- fr
pretty_name: fr-summarizer-dataset
size_categories:
- 1K<n<10K
---
# training data
- Dataset : [fr-summarizer-dataset](https://huggingface.co/datasets/Labagaite/fr-summarizer-dataset)
- Data-size : 7.65 MB
- train : 1.97k rows
- validation : 440 rows
- roles : user , assistant
- Format : chatml ("role": "role", "content": "content", "user": "user", "assistant": "assistant")
<br>
*French audio podcast transcription*
# Project details
[<img src="https://avatars.githubusercontent.com/u/116890814?v=4" width="100"/>](https://github.com/WillIsback/Report_Maker)
Fine-tuned on French audio podcast transcription data for the summarization task. As a result, the model is able to summarize French audio podcast transcriptions.
The model will be used in an AI application, [Report Maker](https://github.com/WillIsback/Report_Maker), which is a powerful tool designed to automate the process of transcribing and summarizing meetings.
It leverages state-of-the-art machine learning models to provide detailed and accurate reports.
# Building the dataset:
The dataset was built from OpenAI GPT-3.5-Turbo generative responses to a summarization task, that model being already competent at summarizing in French and having a large context window.
The max_new_token_length was set to 1024 so that the outputs fit the training of smaller models.
Very small models such as TinyLlama would otherwise need to truncate, which would hurt the context and the quality of the training.
Check the [prompt](https://github.com/WillIsback/Report_Maker/blob/main/Utils/prompts.py) structures designed to perform three summarization tasks:
- Simple summarization
- Map-reduce summarization
- Refine summarization
Also check the [code](https://github.com/WillIsback/Report_Maker/blob/main/Utils/summarize_dataset_builder.py) used to generate the responses for this dataset.
# Formatting data for [unsloth](https://github.com/unslothai/unsloth)/[Summarize](https://github.com/WillIsback/LLM_Summarizer_Trainer) training:
```Python
from datasets import load_dataset, Dataset
import pandas as pd
from unsloth.chat_templates import get_chat_template
class ChatTemplate():
def __init__(self, tokenizer):
self.tokenizer = tokenizer
def formating_messages(self,example):
user_chat = {"role": example["user"]["role"], "content": example["user"]["content"]}
assistant_chat = {"role": example["assistant"]["role"], "content": example["assistant"]["content"]}
return {"messages": [user_chat, assistant_chat]}
def formatting_prompts_func(self,examples):
convos = examples["messages"]
texts = [self.tokenizer.apply_chat_template(convo, tokenize = False, add_generation_prompt = False) for convo in convos]
return { "text" : texts, }
def load_data(self):
self.tokenizer = get_chat_template(
self.tokenizer,
chat_template = "chatml", # Supports zephyr, chatml, mistral, llama, alpaca, vicuna, vicuna_old, unsloth
mapping = {"role": "role", "content": "content", "user": "user", "assistant": "assistant"}, # ShareGPT style
map_eos_token = True, # Maps <|im_end|> to </s> instead
)
dataset_train = load_dataset("Labagaite/fr-summarizer-dataset", split = "train")
dataset_val = load_dataset("Labagaite/fr-summarizer-dataset", split = "validation")
# Group the data
grouped_data_train = [{"user": dataset_train[i], "assistant": dataset_train[i+1]} for i in range(0, len(dataset_train), 2)]
grouped_data_val = [{"user": dataset_val[i], "assistant": dataset_val[i+1]} for i in range(0, len(dataset_val), 2)]
# Convert the list of dictionaries to a DataFrame
df_train = pd.DataFrame(grouped_data_train)
df_val = pd.DataFrame(grouped_data_val)
# Create a new Dataset object
dataset_train = Dataset.from_pandas(df_train)
dataset_val = Dataset.from_pandas(df_val)
dataset_train = dataset_train.map(self.formating_messages, batched = False)
dataset_train = dataset_train.map(self.formatting_prompts_func, batched = True)
dataset_val = dataset_val.map(self.formating_messages, batched = False)
dataset_val = dataset_val.map(self.formatting_prompts_func, batched = True)
return dataset_train, dataset_val
```
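The grouping step in `load_data` above relies on user and assistant rows strictly alternating in the dataset. In isolation, that pairing logic can be sketched as follows (`group_turns` is an illustrative helper, not part of the dataset tooling):

```python
def group_turns(rows):
    """Pair consecutive rows into user/assistant turns.

    Assumes rows strictly alternate: even indices hold user turns,
    odd indices hold the corresponding assistant replies.
    """
    if len(rows) % 2 != 0:
        raise ValueError("expected an even number of rows (user/assistant pairs)")
    return [{"user": rows[i], "assistant": rows[i + 1]}
            for i in range(0, len(rows), 2)]

# Example with the chatml-style fields used by this dataset
rows = [
    {"role": "user", "content": "Résume ce passage."},
    {"role": "assistant", "content": "Voici le résumé."},
]
pairs = group_turns(rows)
print(pairs[0]["assistant"]["content"])
```

If the alternation assumption is violated (e.g. a missing assistant reply), the pairing silently shifts, which is why the real pipeline keeps the two splits strictly ordered.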
|
NathanRoll/TalkBank_CA_Croatian | ---
dataset_info:
features:
- name: audio
sequence: float32
- name: __index_level_0__
dtype: string
splits:
- name: train
num_bytes: 5827530910
num_examples: 135
download_size: 5834902692
dataset_size: 5827530910
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "TalkBank_CA_Croatian"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
codymlewis/HAR | ---
dataset_info:
features:
- name: features
sequence: float32
length: 561
- name: labels
dtype:
class_label:
names:
'0': WALKING
'1': WALKING_UPSTAIRS
'2': WALKING_DOWNSTAIRS
'3': SITTING
'4': STANDING
'5': LAYING
'6': STAND_TO_SIT
'7': SIT_TO_STAND
'8': SIT_TO_LIE
'9': LIE_TO_SIT
'10': STAND_TO_LIE
'11': LIE_TO_STAND
- name: subject id
dtype: uint8
splits:
- name: train
num_bytes: 17499051
num_examples: 7767
- name: test
num_bytes: 7123986
num_examples: 3162
download_size: 79596192
dataset_size: 24623037
license: cc-by-4.0
pretty_name: HAR
size_categories:
- n<1K
---
# Dataset Card for HAR
A tabular dataset posing the task of predicting human activity from smartphone sensor signals (accelerometer and gyroscope).
## Dataset Details
### Dataset Description
*Summary from https://archive.ics.uci.edu/dataset/240/human+activity+recognition+using+smartphones:*
The experiments were carried out with a group of 30 volunteers within an age bracket of 19-48 years. They performed a protocol of activities composed of six basic activities: three static postures (standing, sitting, lying) and three dynamic activities (walking, walking downstairs and walking upstairs). The experiment also included postural transitions that occurred between the static postures. These are: stand-to-sit, sit-to-stand, sit-to-lie, lie-to-sit, stand-to-lie, and lie-to-stand. All the participants were wearing a smartphone (Samsung Galaxy S II) on the waist during the experiment execution. We captured 3-axial linear acceleration and 3-axial angular velocity at a constant rate of 50Hz using the embedded accelerometer and gyroscope of the device. The experiments were video-recorded to label the data manually. The obtained dataset was randomly partitioned into two sets, where 70% of the volunteers was selected for generating the training data and 30% the test data.
The sensor signals (accelerometer and gyroscope) were pre-processed by applying noise filters and then sampled in fixed-width sliding windows of 2.56 sec and 50% overlap (128 readings/window). The sensor acceleration signal, which has gravitational and body motion components, was separated using a Butterworth low-pass filter into body acceleration and gravity. The gravitational force is assumed to have only low frequency components, therefore a filter with 0.3 Hz cutoff frequency was used. From each window, a vector of 561 features was obtained by calculating variables from the time and frequency domain. See 'features_info.txt' for more details.
This dataset is an updated version of the UCI Human Activity Recognition Using smartphones Dataset that can be found at: https://archive.ics.uci.edu/ml/datasets/Human+Activity+Recognition+Using+Smartphones
This version provides the original raw inertial signals from the smartphone sensors, instead of the ones pre-processed into windows which were provided in version 1. This change was done in order to be able to make online tests with the raw data. Moreover, the activity labels were updated in order to include postural transitions that were not part of the previous version of the dataset.
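The pre-processing described above — 2.56 s windows at 50 Hz with 50% overlap — corresponds to 128-sample windows advanced in steps of 64 samples. A minimal sketch of that windowing (the `sliding_windows` helper is illustrative, not part of the dataset's tooling):

```python
def sliding_windows(signal, size=128, step=64):
    """Split a 1-D signal into fixed-width, 50%-overlapping windows.

    size=128 and step=64 match the dataset's 2.56 s windows sampled
    at 50 Hz with 50% overlap; trailing samples that do not fill a
    full window are dropped.
    """
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, step)]

# A 256-sample signal yields 3 overlapping windows: [0:128], [64:192], [128:256]
windows = sliding_windows(list(range(256)))
print(len(windows))  # -> 3
```

Each such window is then reduced to the 561-dimensional feature vector stored in the `features` column.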
- **Curated by:** Reyes-Ortiz, Jorge, Anguita, Davide, Ghio, Alessandro, Oneto, Luca, and Parra, Xavier
- **License:** This dataset is licensed under a [Creative Commons Attribution 4.0 International (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/legalcode) license.
### Dataset Sources
- **Repository:** http://archive.ics.uci.edu/dataset/341/smartphone+based+recognition+of+human+activities+and+postural+transitions
- **Paper:** https://www.sciencedirect.com/science/article/abs/pii/S0925231215010930
- **Experiment Demo:** http://www.youtube.com/watch?v=XOEN9W05_4A
## Citation
**BibTeX:**
```
@misc{misc_smartphone-based_recognition_of_human_activities_and_postural_transitions_341,
  author       = {Reyes-Ortiz, Jorge and Anguita, Davide and Oneto, Luca and Parra, Xavier},
  title        = {{Smartphone-Based Recognition of Human Activities and Postural Transitions}},
  year         = {2015},
  howpublished = {UCI Machine Learning Repository},
  note         = {{DOI}: https://doi.org/10.24432/C54G7M}
}
```
**APA:**
Reyes-Ortiz, Jorge, Anguita, Davide, Oneto, Luca, and Parra, Xavier. (2015). Smartphone-Based Recognition of Human Activities and Postural Transitions. UCI Machine Learning Repository. https://doi.org/10.24432/C54G7M. |
autoevaluate/autoeval-eval-squad_v2-squad_v2-878283-2493776900 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: Jiqing/bert-large-uncased-whole-word-masking-finetuned-squad-finetuned-squad
metrics: ['precision', 'recall']
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: Jiqing/bert-large-uncased-whole-word-masking-finetuned-squad-finetuned-squad
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Florence Gundidza](https://huggingface.co/Florence Gundidza) for evaluating this model. |
CVasNLPExperiments/cv-as-nlp-vision-example | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: train
num_bytes: 120297907.375
num_examples: 3669
download_size: 119028407
dataset_size: 120297907.375
---
# Dataset Card for "test_sub"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hemantk089/llama2_7b_fine_tuning_complete_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 261946
num_examples: 917
download_size: 70457
dataset_size: 261946
---
# Dataset Card for "llama2_7b_fine_tuning_complete_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mwalol/wikipapa | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
pretty_name: Wikipedia
paperswithcode_id: null
license:
- cc-by-sa-3.0
- gfdl
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
source_datasets:
- original
multilinguality:
- multilingual
size_categories:
- n<1K
- 1K<n<10K
- 10K<n<100K
- 100K<n<1M
- 1M<n<10M
language:
- aa
- ab
- ace
- af
- ak
- als
- am
- an
- ang
- ar
- arc
- arz
- as
- ast
- atj
- av
- ay
- az
- azb
- ba
- bar
- bcl
- be
- bg
- bh
- bi
- bjn
- bm
- bn
- bo
- bpy
- br
- bs
- bug
- bxr
- ca
- cbk
- cdo
- ce
- ceb
- ch
- cho
- chr
- chy
- ckb
- co
- cr
- crh
- cs
- csb
- cu
- cv
- cy
- da
- de
- din
- diq
- dsb
- dty
- dv
- dz
- ee
- el
- eml
- en
- eo
- es
- et
- eu
- ext
- fa
- ff
- fi
- fj
- fo
- fr
- frp
- frr
- fur
- fy
- ga
- gag
- gan
- gd
- gl
- glk
- gn
- gom
- gor
- got
- gu
- gv
- ha
- hak
- haw
- he
- hi
- hif
- ho
- hr
- hsb
- ht
- hu
- hy
- ia
- id
- ie
- ig
- ii
- ik
- ilo
- inh
- io
- is
- it
- iu
- ja
- jam
- jbo
- jv
- ka
- kaa
- kab
- kbd
- kbp
- kg
- ki
- kj
- kk
- kl
- km
- kn
- ko
- koi
- krc
- ks
- ksh
- ku
- kv
- kw
- ky
- la
- lad
- lb
- lbe
- lez
- lfn
- lg
- li
- lij
- lmo
- ln
- lo
- lrc
- lt
- ltg
- lv
- lzh
- mai
- mdf
- mg
- mh
- mhr
- mi
- min
- mk
- ml
- mn
- mr
- mrj
- ms
- mt
- mus
- mwl
- my
- myv
- mzn
- na
- nah
- nan
- nap
- nds
- ne
- new
- ng
- nl
- nn
- 'no'
- nov
- nrf
- nso
- nv
- ny
- oc
- olo
- om
- or
- os
- pa
- pag
- pam
- pap
- pcd
- pdc
- pfl
- pi
- pih
- pl
- pms
- pnb
- pnt
- ps
- pt
- qu
- rm
- rmy
- rn
- ro
- ru
- rue
- rup
- rw
- sa
- sah
- sat
- sc
- scn
- sco
- sd
- se
- sg
- sgs
- sh
- si
- sk
- sl
- sm
- sn
- so
- sq
- sr
- srn
- ss
- st
- stq
- su
- sv
- sw
- szl
- ta
- tcy
- tdt
- te
- tg
- th
- ti
- tk
- tl
- tn
- to
- tpi
- tr
- ts
- tt
- tum
- tw
- ty
- tyv
- udm
- ug
- uk
- ur
- uz
- ve
- vec
- vep
- vi
- vls
- vo
- vro
- wa
- war
- wo
- wuu
- xal
- xh
- xmf
- yi
- yo
- yue
- za
- zea
- zh
- zu
language_bcp47:
- nds-nl
config_names:
- 20240101.aa
- 20220101.ab
- 20240101.ace
- 20240101.ady
- 20240101.af
- 20240101.ak
- 20240101.als
- 20240101.am
- 20240101.an
- 20240101.ang
- 20240101.ar
- 20240101.arc
- 20240101.arz
- 20240101.as
- 20240101.ast
- 20240101.atj
- 20240101.av
- 20240101.ay
- 20240101.az
- 20240101.azb
- 20240101.ba
- 20240101.bar
- 20240101.bat-smg
- 20240101.bcl
- 20240101.be
- 20240101.be-x-old
- 20240101.bg
- 20240101.bh
- 20240101.bi
- 20240101.bjn
- 20240101.bm
- 20240101.bn
- 20240101.bo
- 20240101.bpy
- 20240101.br
- 20240101.bs
- 20240101.bug
- 20240101.bxr
- 20240101.ca
- 20240101.cbk-zam
- 20240101.cdo
- 20240101.ce
- 20240101.ceb
- 20240101.ch
- 20240101.cho
- 20240101.chr
- 20240101.chy
- 20240101.ckb
- 20240101.co
- 20240101.cr
- 20240101.crh
- 20240101.cs
- 20240101.csb
- 20240101.cu
- 20240101.cv
- 20240101.cy
- 20240101.da
- 20240101.de
- 20240101.din
- 20240101.diq
- 20240101.dsb
- 20240101.dty
- 20240101.dv
- 20240101.dz
- 20240101.ee
- 20240101.el
- 20240101.eml
- 20240101.en
- 20240101.eo
- 20240101.es
- 20240101.et
- 20240101.eu
- 20240101.ext
- 20240101.fa
- 20240101.ff
- 20240101.fi
- 20240101.fiu-vro
- 20240101.fj
- 20240101.fo
- 20240101.fr
- 20240101.frp
- 20240101.frr
- 20240101.fur
- 20240101.fy
- 20240101.ga
- 20240101.gag
- 20240101.gan
- 20240101.gd
- 20240101.gl
- 20240101.glk
- 20240101.gn
- 20240101.gom
- 20240101.gor
- 20240101.got
- 20240101.gu
- 20240101.gv
- 20240101.ha
- 20240101.hak
- 20240101.haw
- 20240101.he
- 20240101.hi
- 20240101.hif
- 20240101.ho
- 20240101.hr
- 20240101.hsb
- 20240101.ht
- 20240101.hu
- 20240101.hy
- 20240101.ia
- 20240101.id
- 20240101.ie
- 20240101.ig
- 20240101.ii
- 20240101.ik
- 20240101.ilo
- 20240101.inh
- 20240101.io
- 20240101.is
- 20240101.it
- 20240101.iu
- 20240101.ja
- 20240101.jam
- 20240101.jbo
- 20240101.jv
- 20240101.ka
- 20240101.kaa
- 20240101.kab
- 20240101.kbd
- 20240101.kbp
- 20240101.kg
- 20240101.ki
- 20240101.kj
- 20240101.kk
- 20240101.kl
- 20240101.km
- 20240101.kn
- 20240101.ko
- 20240101.koi
- 20240101.krc
- 20240101.ks
- 20240101.ksh
- 20240101.ku
- 20240101.kv
- 20240101.kw
- 20240101.ky
- 20240101.la
- 20240101.lad
- 20240101.lb
- 20240101.lbe
- 20240101.lez
- 20240101.lfn
- 20240101.lg
- 20240101.li
- 20240101.lij
- 20240101.lmo
- 20240101.ln
- 20240101.lo
- 20240101.lrc
- 20240101.lt
- 20240101.ltg
- 20240101.lv
- 20240101.mai
- 20240101.map-bms
- 20240101.mdf
- 20240101.mg
- 20240101.mh
- 20240101.mhr
- 20240101.mi
- 20240101.min
- 20240101.mk
- 20240101.ml
- 20240101.mn
- 20240101.mr
- 20240101.mrj
- 20240101.ms
- 20240101.mt
- 20240101.mus
- 20240101.mwl
- 20240101.my
- 20240101.myv
- 20240101.mzn
- 20240101.na
- 20240101.nah
- 20240101.nap
- 20240101.nds
- 20240101.nds-nl
- 20240101.ne
- 20240101.new
- 20240101.ng
- 20240101.nl
- 20240101.nn
- 20240101.no
- 20240101.nov
- 20240101.nrm
- 20240101.nso
- 20240101.nv
- 20240101.ny
- 20240101.oc
- 20240101.olo
- 20240101.om
- 20240101.or
- 20240101.os
- 20240101.pa
- 20240101.pag
- 20240101.pam
- 20240101.pap
- 20240101.pcd
- 20240101.pdc
- 20240101.pfl
- 20240101.pi
- 20240101.pih
- 20240101.pl
- 20240101.pms
- 20240101.pnb
- 20240101.pnt
- 20240101.ps
- 20240101.pt
- 20240101.qu
- 20240101.rm
- 20240101.rmy
- 20240101.rn
- 20240101.ro
- 20240101.roa-rup
- 20240101.roa-tara
- 20240101.ru
- 20240101.rue
- 20240101.rw
- 20240101.sa
- 20240101.sah
- 20240101.sat
- 20240101.sc
- 20240101.scn
- 20240101.sco
- 20240101.sd
- 20240101.se
- 20240101.sg
- 20240101.sh
- 20240101.si
- 20240101.simple
- 20240101.sk
- 20240101.sl
- 20240101.sm
- 20240101.sn
- 20240101.so
- 20240101.sq
- 20240101.sr
- 20240101.srn
- 20240101.ss
- 20240101.st
- 20240101.stq
- 20240101.su
- 20240101.sv
- 20240101.sw
- 20240101.szl
- 20240101.ta
- 20240101.tcy
- 20240101.te
- 20240101.tet
- 20240101.tg
- 20240101.th
- 20240101.ti
- 20240101.tk
- 20240101.tl
- 20240101.tn
- 20240101.to
- 20240101.tpi
- 20240101.tr
- 20240101.ts
- 20240101.tt
- 20240101.tum
- 20240101.tw
- 20240101.ty
- 20240101.tyv
- 20240101.udm
- 20240101.ug
- 20240101.uk
- 20240101.ur
- 20240101.uz
- 20240101.ve
- 20240101.vec
- 20240101.vep
- 20240101.vi
- 20240101.vls
- 20240101.vo
- 20240101.wa
- 20240101.war
- 20240101.wo
- 20240101.wuu
- 20240101.xal
- 20240101.xh
- 20240101.xmf
- 20240101.yi
- 20240101.yo
- 20240101.za
- 20240101.zea
- 20240101.zh
- 20240101.zh-classical
- 20240101.zh-min-nan
- 20240101.zh-yue
- 20240101.zu
---
# Dataset Card for Wikipedia
This repo is a fork of the [olm/wikipedia](https://huggingface.co/datasets/olm/wikipedia) repo which itself is a fork of the original Hugging Face Wikipedia repo [here](https://huggingface.co/datasets/wikipedia).
This fork modifies `olm/wikipedia` to enable running on lower resourced machines. These changes have been proposed as a [PR with the olm/wikipedia project](https://huggingface.co/datasets/olm/wikipedia/discussions/6).
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://dumps.wikimedia.org](https://dumps.wikimedia.org)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
Wikipedia dataset containing cleaned articles of all languages. The datasets are built from the Wikipedia dump (https://dumps.wikimedia.org/) with one split per language. Each example contains the content of one full Wikipedia article with cleaning to strip markdown and unwanted sections (references, etc.).
The articles are parsed using the ``mwparserfromhell`` tool.
To load this dataset you need to install the following dependencies:
```
pip install mwparserfromhell datasets
```
Then, you can load any subset of Wikipedia per language and per date this way:
```python
from datasets import load_dataset
load_dataset("neuml/wikipedia", language="en", date="20240101")
```
You can find the full list of languages and dates [here](https://dumps.wikimedia.org/backup-index.html).
### Supported Tasks and Leaderboards
The dataset is generally used for Language Modeling.
### Languages
You can find the list of languages [here](https://meta.wikimedia.org/wiki/List_of_Wikipedias).
## Dataset Structure
### Data Instances
An example looks as follows:
```
{'id': '1',
'url': 'https://simple.wikipedia.org/wiki/April',
'title': 'April',
'text': 'April is the fourth month...'
}
```
### Data Fields
The data fields are the same among all configurations:
- `id` (`str`): ID of the article.
- `url` (`str`): URL of the article.
- `title` (`str`): Title of the article.
- `text` (`str`): Text content of the article.
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
Most of Wikipedia's text and many of its images are co-licensed under the [Creative Commons Attribution-ShareAlike 3.0 Unported License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_Creative_Commons_Attribution-ShareAlike_3.0_Unported_License) (CC BY-SA) and the [GNU Free Documentation License](https://en.wikipedia.org/wiki/Wikipedia:Text_of_the_GNU_Free_Documentation_License) (GFDL) (unversioned, with no invariant sections, front-cover texts, or back-cover texts).
Some text has been imported only under CC BY-SA and CC BY-SA-compatible license and cannot be reused under GFDL; such text will be identified on the page footer, in the page history, or on the discussion page of the article that utilizes the text.
### Citation Information
```
@ONLINE{wikidump,
author = "Wikimedia Foundation",
title = "Wikimedia Downloads",
url = "https://dumps.wikimedia.org"
}
``` |
DBQ/Prada.Product.prices.Austria | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Austria - Prada - Product-level price list
tags:
- webscraping
- ecommerce
- Prada
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 1280254
num_examples: 2545
download_size: 387267
dataset_size: 1280254
---
# Prada web scraped data
## About the website
The fashion industry, particularly the luxury fashion segment, exhibits a vast and dynamic scope in the Europe, Middle East, and Africa (EMEA) region, with Austria playing a crucial role in its positive trajectory. **Prada**, a prominent luxury fashion icon, continues to thrive in Austria's competitive market. **The industry** is significantly propelled by advancements in technology, leading to a surge in **Ecommerce** platforms. A recent dataset provides insights into the **Ecommerce product-list page (PLP) data** of Prada in Austria. Such data is critical for understanding consumer preferences, purchasing patterns and overall market trends, subsequently informing strategic business decisions and marketing campaigns.
## Link to **dataset**
[Austria - Prada - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Prada%20Product-prices%20Austria/r/recYuVyhn9tiSVuIh)
|
nickmuchi/financial-text-combo-classification | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 1989291
num_examples: 17971
- name: validation
num_bytes: 414441
num_examples: 3863
download_size: 1463828
dataset_size: 2403732
task_categories:
- text-classification
task_ids:
- multi-class-classification
- sentiment-classification
size_categories:
- 10K<n<100K
language:
- en
pretty_name: FinTextComboClassification
tags:
- finance
---
# Dataset Card for "financial-text-combo-classification"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DFKI-SLT/fabner | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- found
license:
- other
multilinguality:
- monolingual
pretty_name: FabNER is a manufacturing text dataset for Named Entity Recognition.
size_categories:
- 10K<n<100K
source_datasets: []
tags:
- manufacturing
- 2000-2020
task_categories:
- token-classification
task_ids:
- named-entity-recognition
dataset_info:
- config_name: fabner
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-MATE
'2': I-MATE
'3': O-MATE
'4': E-MATE
'5': S-MATE
'6': B-MANP
'7': I-MANP
'8': O-MANP
'9': E-MANP
'10': S-MANP
'11': B-MACEQ
'12': I-MACEQ
'13': O-MACEQ
'14': E-MACEQ
'15': S-MACEQ
'16': B-APPL
'17': I-APPL
'18': O-APPL
'19': E-APPL
'20': S-APPL
'21': B-FEAT
'22': I-FEAT
'23': O-FEAT
'24': E-FEAT
'25': S-FEAT
'26': B-PRO
'27': I-PRO
'28': O-PRO
'29': E-PRO
'30': S-PRO
'31': B-CHAR
'32': I-CHAR
'33': O-CHAR
'34': E-CHAR
'35': S-CHAR
'36': B-PARA
'37': I-PARA
'38': O-PARA
'39': E-PARA
'40': S-PARA
'41': B-ENAT
'42': I-ENAT
'43': O-ENAT
'44': E-ENAT
'45': S-ENAT
'46': B-CONPRI
'47': I-CONPRI
'48': O-CONPRI
'49': E-CONPRI
'50': S-CONPRI
'51': B-MANS
'52': I-MANS
'53': O-MANS
'54': E-MANS
'55': S-MANS
'56': B-BIOP
'57': I-BIOP
'58': O-BIOP
'59': E-BIOP
'60': S-BIOP
splits:
- name: train
num_bytes: 4394010
num_examples: 9435
- name: validation
num_bytes: 934347
num_examples: 2183
- name: test
num_bytes: 940136
num_examples: 2064
download_size: 3793613
dataset_size: 6268493
- config_name: fabner_bio
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-MATE
'2': I-MATE
'3': B-MANP
'4': I-MANP
'5': B-MACEQ
'6': I-MACEQ
'7': B-APPL
'8': I-APPL
'9': B-FEAT
'10': I-FEAT
'11': B-PRO
'12': I-PRO
'13': B-CHAR
'14': I-CHAR
'15': B-PARA
'16': I-PARA
'17': B-ENAT
'18': I-ENAT
'19': B-CONPRI
'20': I-CONPRI
'21': B-MANS
'22': I-MANS
'23': B-BIOP
'24': I-BIOP
splits:
- name: train
num_bytes: 4394010
num_examples: 9435
- name: validation
num_bytes: 934347
num_examples: 2183
- name: test
num_bytes: 940136
num_examples: 2064
download_size: 3793613
dataset_size: 6268493
- config_name: fabner_simple
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': MATE
'2': MANP
'3': MACEQ
'4': APPL
'5': FEAT
'6': PRO
'7': CHAR
'8': PARA
'9': ENAT
'10': CONPRI
'11': MANS
'12': BIOP
splits:
- name: train
num_bytes: 4394010
num_examples: 9435
- name: validation
num_bytes: 934347
num_examples: 2183
- name: test
num_bytes: 940136
num_examples: 2064
download_size: 3793613
dataset_size: 6268493
- config_name: text2tech
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': Technological System
'2': Method
'3': Material
'4': Technical Field
splits:
- name: train
num_bytes: 4394010
num_examples: 9435
- name: validation
num_bytes: 934347
num_examples: 2183
- name: test
num_bytes: 940136
num_examples: 2064
download_size: 3793613
dataset_size: 6268493
---
# Dataset Card for FabNER
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://figshare.com/articles/dataset/Dataset_NER_Manufacturing_-_FabNER_Information_Extraction_from_Manufacturing_Process_Science_Domain_Literature_Using_Named_Entity_Recognition/14782407](https://figshare.com/articles/dataset/Dataset_NER_Manufacturing_-_FabNER_Information_Extraction_from_Manufacturing_Process_Science_Domain_Literature_Using_Named_Entity_Recognition/14782407)
- **Paper:** ["FabNER": information extraction from manufacturing process science domain literature using named entity recognition](https://par.nsf.gov/servlets/purl/10290810)
- **Size of downloaded dataset files:** 3.79 MB
- **Size of the generated dataset:** 6.27 MB
### Dataset Summary
FabNER is a manufacturing text corpus of 350,000+ words for Named Entity Recognition.
It is a collection of abstracts obtained from Web of Science for well-known journals in manufacturing process
science research.
Every word is assigned one of the following categories/entity labels: Material (MATE), Manufacturing Process (MANP),
Machine/Equipment (MACEQ), Application (APPL), Features (FEAT), Mechanical Properties (PRO), Characterization (CHAR),
Parameters (PARA), Enabling Technology (ENAT), Concept/Principles (CONPRI), Manufacturing Standards (MANS) and
BioMedical (BIOP). Annotation covers all categories, with output tags in 'BIOES' format:
B=Beginning, I=Intermediate, O=Outside, E=End, S=Single.
For details about the dataset, please refer to the paper: ["FabNER": information extraction from manufacturing process science domain literature using named entity recognition](https://par.nsf.gov/servlets/purl/10290810)
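As an unofficial illustration, a BIOES tag sequence can be decoded into entity spans with a minimal sketch like the one below (note that the `fabner` config additionally contains `O-<label>` tags, which this sketch simply treats as outside an entity):

```python
def bioes_to_spans(tokens, tags):
    """Decode a BIOES tag sequence into (entity_type, start, end, text) spans.

    B/I/E mark the beginning, intermediate and end tokens of multi-token
    entities, S marks single-token entities, and "O" (or any unexpected tag,
    such as the O-<label> tags in the `fabner` config) counts as outside.
    """
    spans, start, etype = [], None, None
    for i, tag in enumerate(tags):
        prefix, _, label = tag.partition("-")
        if prefix == "S":
            spans.append((label, i, i + 1, tokens[i]))
            start, etype = None, None
        elif prefix == "B":
            start, etype = i, label
        elif prefix == "E" and start is not None and label == etype:
            spans.append((etype, start, i + 1, " ".join(tokens[start:i + 1])))
            start, etype = None, None
        elif prefix == "I" and label == etype:
            continue
        else:
            start, etype = None, None
    return spans

# "flow patterns" is tagged B-CONPRI, E-CONPRI in the example from this card;
# the final S-PRO tag is a made-up addition to show single-token entities.
print(bioes_to_spans(["the", "flow", "patterns", "and", "speeds"],
                     ["O", "B-CONPRI", "E-CONPRI", "O", "S-PRO"]))
# → [('CONPRI', 1, 3, 'flow patterns'), ('PRO', 4, 5, 'speeds')]
```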
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
The language in the dataset is English.
## Dataset Structure
### Data Instances
- **Size of downloaded dataset files:** 3.79 MB
- **Size of the generated dataset:** 6.27 MB
An example of 'train' looks as follows:
```json
{
"id": "0",
"tokens": ["Revealed", "the", "location-specific", "flow", "patterns", "and", "quantified", "the", "speeds", "of", "various", "types", "of", "flow", "."],
"ner_tags": [0, 0, 0, 46, 49, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
}
```
### Data Fields
#### fabner
- `id`: the instance id of this sentence, a `string` feature.
- `tokens`: the list of tokens of this sentence, a `list` of `string` features.
- `ner_tags`: the list of entity tags, a `list` of classification labels.
```json
{"O": 0, "B-MATE": 1, "I-MATE": 2, "O-MATE": 3, "E-MATE": 4, "S-MATE": 5, "B-MANP": 6, "I-MANP": 7, "O-MANP": 8, "E-MANP": 9, "S-MANP": 10, "B-MACEQ": 11, "I-MACEQ": 12, "O-MACEQ": 13, "E-MACEQ": 14, "S-MACEQ": 15, "B-APPL": 16, "I-APPL": 17, "O-APPL": 18, "E-APPL": 19, "S-APPL": 20, "B-FEAT": 21, "I-FEAT": 22, "O-FEAT": 23, "E-FEAT": 24, "S-FEAT": 25, "B-PRO": 26, "I-PRO": 27, "O-PRO": 28, "E-PRO": 29, "S-PRO": 30, "B-CHAR": 31, "I-CHAR": 32, "O-CHAR": 33, "E-CHAR": 34, "S-CHAR": 35, "B-PARA": 36, "I-PARA": 37, "O-PARA": 38, "E-PARA": 39, "S-PARA": 40, "B-ENAT": 41, "I-ENAT": 42, "O-ENAT": 43, "E-ENAT": 44, "S-ENAT": 45, "B-CONPRI": 46, "I-CONPRI": 47, "O-CONPRI": 48, "E-CONPRI": 49, "S-CONPRI": 50, "B-MANS": 51, "I-MANS": 52, "O-MANS": 53, "E-MANS": 54, "S-MANS": 55, "B-BIOP": 56, "I-BIOP": 57, "O-BIOP": 58, "E-BIOP": 59, "S-BIOP": 60}
```
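For illustration, the integer `ner_tags` from the `train` example shown under Data Instances can be mapped back to tag strings by inverting this dictionary (only the subset of the mapping needed for that example is reproduced here):

```python
# Subset of the fabner label mapping above, enough for the card's example.
id2label = {0: "O", 46: "B-CONPRI", 49: "E-CONPRI"}

tokens = ["Revealed", "the", "location-specific", "flow", "patterns", "and",
          "quantified", "the", "speeds", "of", "various", "types", "of", "flow", "."]
ner_tags = [0, 0, 0, 46, 49, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]

tags = [id2label[i] for i in ner_tags]
print([(tok, tag) for tok, tag in zip(tokens, tags) if tag != "O"])
# → [('flow', 'B-CONPRI'), ('patterns', 'E-CONPRI')]
```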
#### fabner_bio
- `id`: the instance id of this sentence, a `string` feature.
- `tokens`: the list of tokens of this sentence, a `list` of `string` features.
- `ner_tags`: the list of entity tags, a `list` of classification labels.
```json
{"O": 0, "B-MATE": 1, "I-MATE": 2, "B-MANP": 3, "I-MANP": 4, "B-MACEQ": 5, "I-MACEQ": 6, "B-APPL": 7, "I-APPL": 8, "B-FEAT": 9, "I-FEAT": 10, "B-PRO": 11, "I-PRO": 12, "B-CHAR": 13, "I-CHAR": 14, "B-PARA": 15, "I-PARA": 16, "B-ENAT": 17, "I-ENAT": 18, "B-CONPRI": 19, "I-CONPRI": 20, "B-MANS": 21, "I-MANS": 22, "B-BIOP": 23, "I-BIOP": 24}
```
#### fabner_simple
- `id`: the instance id of this sentence, a `string` feature.
- `tokens`: the list of tokens of this sentence, a `list` of `string` features.
- `ner_tags`: the list of entity tags, a `list` of classification labels.
```json
{"O": 0, "MATE": 1, "MANP": 2, "MACEQ": 3, "APPL": 4, "FEAT": 5, "PRO": 6, "CHAR": 7, "PARA": 8, "ENAT": 9, "CONPRI": 10, "MANS": 11, "BIOP": 12}
```
#### text2tech
- `id`: the instance id of this sentence, a `string` feature.
- `tokens`: the list of tokens of this sentence, a `list` of `string` features.
- `ner_tags`: the list of entity tags, a `list` of classification labels.
```json
{"O": 0, "Technological System": 1, "Method": 2, "Material": 3, "Technical Field": 4}
```
### Data Splits
| | Train | Dev | Test |
|--------|-------|------|------|
| fabner | 9435 | 2183 | 2064 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{DBLP:journals/jim/KumarS22,
author = {Aman Kumar and
Binil Starly},
title = {"FabNER": information extraction from manufacturing process science
domain literature using named entity recognition},
journal = {J. Intell. Manuf.},
volume = {33},
number = {8},
pages = {2393--2407},
year = {2022},
url = {https://doi.org/10.1007/s10845-021-01807-x},
doi = {10.1007/s10845-021-01807-x},
timestamp = {Sun, 13 Nov 2022 17:52:57 +0100},
biburl = {https://dblp.org/rec/journals/jim/KumarS22.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
### Contributions
Thanks to [@phucdev](https://github.com/phucdev) for adding this dataset. |
lhallee/BP_fold | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: seqs
dtype: string
- name: labels
dtype: string
splits:
- name: train
num_bytes: 167079152
num_examples: 26224
- name: valid
num_bytes: 18475462
num_examples: 2904
- name: test
num_bytes: 21781312
num_examples: 3350
download_size: 23395626
dataset_size: 207335926
---
# Dataset Card for "BP_fold"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
diiogo/enem_v2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 1853620
num_examples: 2368
download_size: 1230138
dataset_size: 1853620
---
# Dataset Card for "enem_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigIR/AuFIN | ---
language:
- ar
pretty_name: AuFIN
---
This is an Arabic dataset for Authority FINding in Twitter. For each rumor, we share the top 5 users retrieved by the BM25 lexical retrieval model, where the query is the rumor text and the document collection is the user documents. Each user document is constructed by concatenating the user's translated profile name and description with the names and descriptions of all of their translated Twitter lists.
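The retrieval setup described above can be sketched with a toy Okapi BM25 implementation. The documents below are hypothetical stand-ins for user documents; the query plays the role of the rumor text:

```python
import math
from collections import Counter

def bm25_rank(query, docs, k1=1.5, b=0.75):
    """Rank documents for a query with Okapi BM25; returns doc indices, best first."""
    tokenized = [d.lower().split() for d in docs]
    n_docs = len(tokenized)
    avgdl = sum(len(d) for d in tokenized) / n_docs
    df = Counter(t for d in tokenized for t in set(d))  # document frequencies

    def score(doc):
        tf = Counter(doc)
        total = 0.0
        for t in query.lower().split():
            if t not in tf:
                continue
            idf = math.log((n_docs - df[t] + 0.5) / (df[t] + 0.5) + 1)
            norm = tf[t] + k1 * (1 - b + b * len(doc) / avgdl)
            total += idf * tf[t] * (k1 + 1) / norm
        return total

    return sorted(range(n_docs), key=lambda i: score(tokenized[i]), reverse=True)

# Hypothetical toy collection: each string stands in for one user document.
docs = [
    "ministry of health official account health news",
    "football club fan page",
    "health reporter covering outbreak rumors",
]
print(bm25_rank("health outbreak rumor", docs))
# → [2, 0, 1]
```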
The full dataset can be found [here](https://github.com/Fatima-Haouari/AuFIN) and the test data [here](https://gitlab.com/checkthat_lab/clef2023-checkthat-lab/-/tree/main/task5?ref_type=heads).
This work was published as an IP&M journal paper titled [Who can verify this? Finding authorities for rumor verification in Twitter](https://www.sciencedirect.com/science/article/pii/S0306457323001036) |
SummerSigh/AncientMNIST | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Alpha
'1': Beta
'2': Chi
'3': Delta
'4': Epsilon
'5': Eta
'6': Gamma
'7': Iota
'8': Kappa
'9': Lambda
'10': LunateSigma
'11': Mu
'12': Nu
'13': Omega
'14': Omicron
'15': Phi
'16': Pi
'17': Psi
'18': Rho
'19': Tau
'20': Theta
'21': Upsilon
'22': Xi
'23': Zeta
splits:
- name: train
num_bytes: 309609553.26
num_examples: 205797
download_size: 217254607
dataset_size: 309609553.26
---
# Dataset Card for "AncientMNIST"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_yunconglong__DARE_TIES_13B | ---
pretty_name: Evaluation run of yunconglong/DARE_TIES_13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yunconglong/DARE_TIES_13B](https://huggingface.co/yunconglong/DARE_TIES_13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yunconglong__DARE_TIES_13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T19:46:16.300212](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__DARE_TIES_13B/blob/main/results_2024-02-01T19-46-16.300212.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6515889964448247,\n\
\ \"acc_stderr\": 0.03210174624245573,\n \"acc_norm\": 0.6506057322516777,\n\
\ \"acc_norm_stderr\": 0.03278666812910722,\n \"mc1\": 0.6352509179926561,\n\
\ \"mc1_stderr\": 0.016850961061720137,\n \"mc2\": 0.7865638980237093,\n\
\ \"mc2_stderr\": 0.01379067926936144\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7167235494880546,\n \"acc_stderr\": 0.013167478735134575,\n\
\ \"acc_norm\": 0.7431740614334471,\n \"acc_norm_stderr\": 0.0127669237941168\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7263493328022307,\n\
\ \"acc_stderr\": 0.00444920629592239,\n \"acc_norm\": 0.895040828520215,\n\
\ \"acc_norm_stderr\": 0.0030587440442413545\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.023287665127268542,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.023287665127268542\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198892,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198892\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078962,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078962\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45363128491620114,\n\
\ \"acc_stderr\": 0.016650437588269076,\n \"acc_norm\": 0.45363128491620114,\n\
\ \"acc_norm_stderr\": 0.016650437588269076\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667874,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n\
\ \"acc_stderr\": 0.012749206007657474,\n \"acc_norm\": 0.47131681877444587,\n\
\ \"acc_norm_stderr\": 0.012749206007657474\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886887,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886887\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6352509179926561,\n\
\ \"mc1_stderr\": 0.016850961061720137,\n \"mc2\": 0.7865638980237093,\n\
\ \"mc2_stderr\": 0.01379067926936144\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8808208366219415,\n \"acc_stderr\": 0.009105988620006186\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6755117513267627,\n \
\ \"acc_stderr\": 0.012896095359768114\n }\n}\n```"
repo_url: https://huggingface.co/yunconglong/DARE_TIES_13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|arc:challenge|25_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|gsm8k|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hellaswag|10_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T19-46-16.300212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T19-46-16.300212.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- '**/details_harness|winogrande|5_2024-02-01T19-46-16.300212.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T19-46-16.300212.parquet'
- config_name: results
data_files:
- split: 2024_02_01T19_46_16.300212
path:
- results_2024-02-01T19-46-16.300212.parquet
- split: latest
path:
- results_2024-02-01T19-46-16.300212.parquet
---
# Dataset Card for Evaluation run of yunconglong/DARE_TIES_13B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [yunconglong/DARE_TIES_13B](https://huggingface.co/yunconglong/DARE_TIES_13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yunconglong__DARE_TIES_13B",
	"harness_winogrande_5",
	split="latest")
```
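The per-task config names follow a regular pattern derived from the task name and few-shot count, as shown in the metadata above. As a small sketch (based only on the naming visible in this card; `mmlu_config_name` is a hypothetical helper, not part of any library), the config name for any MMLU subject can be built like this:

```python
def mmlu_config_name(subject: str, shots: int = 5) -> str:
    """Build the config name for a hendrycksTest (MMLU) subject,
    following the naming pattern used in this dataset's metadata."""
    return f"harness_hendrycksTest_{subject}_{shots}"

# For example, the anatomy config:
print(mmlu_config_name("anatomy"))  # harness_hendrycksTest_anatomy_5
```

The resulting string can then be passed as the second argument to `load_dataset` in place of `"harness_winogrande_5"`.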
## Latest results
These are the [latest results from run 2024-02-01T19:46:16.300212](https://huggingface.co/datasets/open-llm-leaderboard/details_yunconglong__DARE_TIES_13B/blob/main/results_2024-02-01T19-46-16.300212.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in its per-task config, in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6515889964448247,
"acc_stderr": 0.03210174624245573,
"acc_norm": 0.6506057322516777,
"acc_norm_stderr": 0.03278666812910722,
"mc1": 0.6352509179926561,
"mc1_stderr": 0.016850961061720137,
"mc2": 0.7865638980237093,
"mc2_stderr": 0.01379067926936144
},
"harness|arc:challenge|25": {
"acc": 0.7167235494880546,
"acc_stderr": 0.013167478735134575,
"acc_norm": 0.7431740614334471,
"acc_norm_stderr": 0.0127669237941168
},
"harness|hellaswag|10": {
"acc": 0.7263493328022307,
"acc_stderr": 0.00444920629592239,
"acc_norm": 0.895040828520215,
"acc_norm_stderr": 0.0030587440442413545
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544064,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544064
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268542,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268542
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198892,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198892
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45363128491620114,
"acc_stderr": 0.016650437588269076,
"acc_norm": 0.45363128491620114,
"acc_norm_stderr": 0.016650437588269076
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667874,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47131681877444587,
"acc_stderr": 0.012749206007657474,
"acc_norm": 0.47131681877444587,
"acc_norm_stderr": 0.012749206007657474
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.018798086284886887,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.018798086284886887
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6352509179926561,
"mc1_stderr": 0.016850961061720137,
"mc2": 0.7865638980237093,
"mc2_stderr": 0.01379067926936144
},
"harness|winogrande|5": {
"acc": 0.8808208366219415,
"acc_stderr": 0.009105988620006186
},
"harness|gsm8k|5": {
"acc": 0.6755117513267627,
"acc_stderr": 0.012896095359768114
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
irds/beir_hotpotqa_dev | ---
pretty_name: '`beir/hotpotqa/dev`'
viewer: false
source_datasets: ['irds/beir_hotpotqa']
task_categories:
- text-retrieval
---
# Dataset Card for `beir/hotpotqa/dev`
The `beir/hotpotqa/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/beir#beir/hotpotqa/dev).
# Data
This dataset provides:
- `queries` (i.e., topics); count=5,447
- `qrels` (relevance assessments); count=10,894
- For `docs`, use [`irds/beir_hotpotqa`](https://huggingface.co/datasets/irds/beir_hotpotqa)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/beir_hotpotqa_dev', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/beir_hotpotqa_dev', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
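The qrels records shown above are flat (one judgment per row); for retrieval evaluation they are typically regrouped into a per-query mapping of `doc_id` to relevance. A minimal sketch of that step, using toy records that mirror the record schema above (the toy values themselves are illustrative, not taken from the dataset):

```python
from collections import defaultdict

def group_qrels(records):
    """Group flat qrels records into {query_id: {doc_id: relevance}}."""
    qrels = defaultdict(dict)
    for rec in records:
        qrels[rec['query_id']][rec['doc_id']] = rec['relevance']
    return qrels

# Toy records mirroring the qrels schema shown above.
sample = [
    {'query_id': 'q1', 'doc_id': 'd1', 'relevance': 2, 'iteration': '0'},
    {'query_id': 'q1', 'doc_id': 'd2', 'relevance': 2, 'iteration': '0'},
]
print(group_qrels(sample)['q1'])  # {'d1': 2, 'd2': 2}
```

The same loop works unchanged on the records yielded by `load_dataset('irds/beir_hotpotqa_dev', 'qrels')`.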
## Citation Information
```
@inproceedings{Yang2018Hotpotqa,
title = "{H}otpot{QA}: A Dataset for Diverse, Explainable Multi-hop Question Answering",
author = "Yang, Zhilin and
Qi, Peng and
Zhang, Saizheng and
Bengio, Yoshua and
Cohen, William and
Salakhutdinov, Ruslan and
Manning, Christopher D.",
booktitle = "Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing",
month = oct # "-" # nov,
year = "2018",
address = "Brussels, Belgium",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/D18-1259",
doi = "10.18653/v1/D18-1259",
pages = "2369--2380"
}
@article{Thakur2021Beir,
title = "BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models",
author = "Thakur, Nandan and Reimers, Nils and Rücklé, Andreas and Srivastava, Abhishek and Gurevych, Iryna",
journal= "arXiv preprint arXiv:2104.08663",
month = "4",
year = "2021",
url = "https://arxiv.org/abs/2104.08663",
}
```
|
liuyanchen1015/MULTI_VALUE_rte_were_was | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 145118
num_examples: 343
- name: train
num_bytes: 142215
num_examples: 303
download_size: 191756
dataset_size: 287333
---
# Dataset Card for "MULTI_VALUE_rte_were_was"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mainakhf/orca-llama2-10k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 15821903
num_examples: 10000
download_size: 9170883
dataset_size: 15821903
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-55000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 644973
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_rizla__trrapi-16b | ---
pretty_name: Evaluation run of rizla/trrapi-16b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rizla/trrapi-16b](https://huggingface.co/rizla/trrapi-16b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rizla__trrapi-16b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-03T21:35:54.885186](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__trrapi-16b/blob/main/results_2024-02-03T21-35-54.885186.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6475033667590542,\n\
\ \"acc_stderr\": 0.032124967055002625,\n \"acc_norm\": 0.648064835420934,\n\
\ \"acc_norm_stderr\": 0.03279129587010941,\n \"mc1\": 0.5826193390452876,\n\
\ \"mc1_stderr\": 0.017262891063272168,\n \"mc2\": 0.7413221252292123,\n\
\ \"mc2_stderr\": 0.014409709803356395\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.013592431519068079,\n\
\ \"acc_norm\": 0.7209897610921502,\n \"acc_norm_stderr\": 0.013106784883601336\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7129057956582354,\n\
\ \"acc_stderr\": 0.004514813363221144,\n \"acc_norm\": 0.8887671778530173,\n\
\ \"acc_norm_stderr\": 0.0031377764442772\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474887,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474887\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217483,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217483\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8532110091743119,\n \"acc_stderr\": 0.01517314184512625,\n \"\
acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.01517314184512625\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229091,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229091\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\
\ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\
\ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40558659217877097,\n\
\ \"acc_stderr\": 0.016421670506339185,\n \"acc_norm\": 0.40558659217877097,\n\
\ \"acc_norm_stderr\": 0.016421670506339185\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4784876140808344,\n\
\ \"acc_stderr\": 0.012758410941038911,\n \"acc_norm\": 0.4784876140808344,\n\
\ \"acc_norm_stderr\": 0.012758410941038911\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886887,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886887\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5826193390452876,\n\
\ \"mc1_stderr\": 0.017262891063272168,\n \"mc2\": 0.7413221252292123,\n\
\ \"mc2_stderr\": 0.014409709803356395\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8634569850039463,\n \"acc_stderr\": 0.009650242900291614\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6118271417740713,\n \
\ \"acc_stderr\": 0.013423607564002755\n }\n}\n```"
repo_url: https://huggingface.co/rizla/trrapi-16b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|arc:challenge|25_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|gsm8k|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hellaswag|10_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T21-35-54.885186.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T21-35-54.885186.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- '**/details_harness|winogrande|5_2024-02-03T21-35-54.885186.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-03T21-35-54.885186.parquet'
- config_name: results
data_files:
- split: 2024_02_03T21_35_54.885186
path:
- results_2024-02-03T21-35-54.885186.parquet
- split: latest
path:
- results_2024-02-03T21-35-54.885186.parquet
---
# Dataset Card for Evaluation run of rizla/trrapi-16b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rizla/trrapi-16b](https://huggingface.co/rizla/trrapi-16b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rizla__trrapi-16b",
"harness_winogrande_5",
split="train")
```
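Once loaded, the per-task records follow the structure shown in the "Latest results" JSON below: a top-level `"all"` aggregate plus one entry per task, each holding `acc`, `acc_stderr`, and related metrics. A minimal sketch of pulling per-task accuracies out of such a payload (the `payload` dict here is a small hypothetical sample, not real leaderboard output):

```python
# Sketch: extracting per-task accuracy from a results payload shaped like the
# "Latest results" JSON in this card. `payload` is a hypothetical sample.
payload = {
    "all": {"acc": 0.6475, "acc_norm": 0.6481},
    "harness|arc:challenge|25": {"acc": 0.6834, "acc_norm": 0.7210},
}

# Keep only the individual tasks, skipping the "all" aggregate entry.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in payload.items()
    if task != "all"
}
print(per_task_acc)  # {'harness|arc:challenge|25': 0.6834}
```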
## Latest results
These are the [latest results from run 2024-02-03T21:35:54.885186](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__trrapi-16b/blob/main/results_2024-02-03T21-35-54.885186.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6475033667590542,
"acc_stderr": 0.032124967055002625,
"acc_norm": 0.648064835420934,
"acc_norm_stderr": 0.03279129587010941,
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272168,
"mc2": 0.7413221252292123,
"mc2_stderr": 0.014409709803356395
},
"harness|arc:challenge|25": {
"acc": 0.6834470989761092,
"acc_stderr": 0.013592431519068079,
"acc_norm": 0.7209897610921502,
"acc_norm_stderr": 0.013106784883601336
},
"harness|hellaswag|10": {
"acc": 0.7129057956582354,
"acc_stderr": 0.004514813363221144,
"acc_norm": 0.8887671778530173,
"acc_norm_stderr": 0.0031377764442772
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474887,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474887
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217483,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217483
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.01517314184512625,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.01517314184512625
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229091,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229091
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40558659217877097,
"acc_stderr": 0.016421670506339185,
"acc_norm": 0.40558659217877097,
"acc_norm_stderr": 0.016421670506339185
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035457,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035457
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4784876140808344,
"acc_stderr": 0.012758410941038911,
"acc_norm": 0.4784876140808344,
"acc_norm_stderr": 0.012758410941038911
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.018798086284886887,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.018798086284886887
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272168,
"mc2": 0.7413221252292123,
"mc2_stderr": 0.014409709803356395
},
"harness|winogrande|5": {
"acc": 0.8634569850039463,
"acc_stderr": 0.009650242900291614
},
"harness|gsm8k|5": {
"acc": 0.6118271417740713,
"acc_stderr": 0.013423607564002755
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lukmanprasetyo/rockpaperscissors | ---
license: mit
---
|
CATIE-AQ/taln-archives_fr_prompt_keywords_extraction | ---
language:
- fr
license:
- cc-by-4.0
size_categories:
- 10K<n<100K
task_categories:
- text-generation
tags:
- keywords-extraction
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- taln-ls2n/taln-archives
---
# taln-archives_fr_prompt_keywords_extraction
## Summary
**taln-archives_fr_prompt_keywords_extraction** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **24,507** rows that can be used for a keywords_extraction task.
The original data (without prompts) comes from the dataset [taln-archives](https://huggingface.co/datasets/taln-ls2n/taln-archives).
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
21 prompts were created for this dataset (7 base instructions, each in three French forms). The logic applied consists in proposing each prompt in the infinitive, in the informal *tu* form (tutoiement) and in the formal *vous* form (vouvoiement).
```
'Extraire les mots clés importants du texte suivant : '+text,
'Extrais les mots clés importants du texte suivant : '+text,
'Extrayez les mots clés importants du texte suivant : '+text,
'Isoler les mots clés importants du texte suivant : '+text,
'Isole les mots clés importants du texte suivant : '+text,
'Isolez les mots clés importants du texte suivant : '+text,
'Dégager des mots clés dans le texte : '+text,
'Dégage des mots clés dans le texte : '+text,
'Dégagez des mots clés dans le texte : '+text,
'Générer des mots clés issus du texte suivant : '+text,
'Génère des mots clés issus du texte suivant : '+text,
'Générez des mots clés issus du texte suivant : '+text,
'Trouver les mots clés du texte : '+text,
'Trouve les mots clés du texte : '+text,
'Trouvez les mots clés du texte : '+text,
'Repérer les mots clés importants présents dans le texte suivant : '+text,
'Repère les mots clés importants présents dans le texte suivant : '+text,
'Repérez les mots clés importants présents dans le texte suivant : '+text,
'Indiquer les mots clés du texte : '+text,
'Indique les mots clés du texte : '+text,
'Indiquez les mots clés du texte : '+text
```
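The prompt-application step described above can be sketched as follows. This is an illustrative reconstruction, not the actual DFP build script: the `inputs`/`targets` column names follow the xP3 convention, and joining keywords with commas is an assumption for illustration.

```python
import random

# Three of the 21 prompt templates listed above; the full list follows the
# same pattern (infinitive / tu / vous variants of each instruction).
prompts = [
    "Extraire les mots clés importants du texte suivant : ",
    "Extrais les mots clés importants du texte suivant : ",
    "Extrayez les mots clés importants du texte suivant : ",
]

def build_example(text, keywords, rng=random):
    """Prefix a text with a randomly drawn prompt and join its keywords
    into a target string, yielding one xP3-style (input, target) row."""
    return {
        "inputs": rng.choice(prompts) + text,
        "targets": ", ".join(keywords),
    }

row = build_example(
    "Nous présentons TALN Archives, une archive numérique des articles de recherche en TALN.",
    ["archive numérique", "TALN"],
)
```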
## Splits
- `train` with 24,507 samples
- no `valid` split
- no `test` split
## How to use?
```python
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/taln-archives_fr_prompt_keywords_extraction")
```
## Citation
### Original data
> - (Boudin, 2013) Florian Boudin. 2013.
[TALN Archives : a digital archive of French research articles in Natural Language Processing (TALN Archives : une archive numérique francophone des articles de recherche en Traitement Automatique de la Langue) [in French]][boudin-2013].
In Proceedings of TALN 2013 (Volume 2: Short Papers), pages 507–514, Les Sables d’Olonne, France. ATALA.
> - (Boudin and Gallina, 2021) Florian Boudin and Ygor Gallina. 2021.
[Redefining Absent Keyphrases and their Effect on Retrieval Effectiveness][boudin-2021].
In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4185–4193, Online. Association for Computational Linguistics.
[boudin-2013]: https://aclanthology.org/F13-2001/
[boudin-2021]: https://aclanthology.org/2021.naacl-main.330/
### This Dataset
```
@misc {centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
    author    = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
    title     = { DFP (Revision 1d24c09) },
    year      = 2023,
    url       = { https://huggingface.co/datasets/CATIE-AQ/DFP },
    doi       = { 10.57967/hf/1200 },
    publisher = { Hugging Face }
}
```
## License
cc-by-4.0 |
open-llm-leaderboard/details_jordiclive__gpt4all-alpaca-oa-codealpaca-lora-13b | ---
pretty_name: Evaluation run of jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b](https://huggingface.co/jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jordiclive__gpt4all-alpaca-oa-codealpaca-lora-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T20:03:30.331669](https://huggingface.co/datasets/open-llm-leaderboard/details_jordiclive__gpt4all-alpaca-oa-codealpaca-lora-13b/blob/main/results_2023-09-22T20-03-30.331669.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n\
\ \"em_stderr\": 0.000456667646266702,\n \"f1\": 0.05642302852349,\n\
\ \"f1_stderr\": 0.0012977737732540458,\n \"acc\": 0.41872834230806744,\n\
\ \"acc_stderr\": 0.009633077195432445\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.000456667646266702,\n\
\ \"f1\": 0.05642302852349,\n \"f1_stderr\": 0.0012977737732540458\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0758150113722517,\n \
\ \"acc_stderr\": 0.0072912057231625796\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702311\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|arc:challenge|25_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T20_03_30.331669
path:
- '**/details_harness|drop|3_2023-09-22T20-03-30.331669.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T20-03-30.331669.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T20_03_30.331669
path:
- '**/details_harness|gsm8k|5_2023-09-22T20-03-30.331669.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T20-03-30.331669.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hellaswag|10_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T17:57:53.688517.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T17:57:53.688517.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T17:57:53.688517.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T20_03_30.331669
path:
- '**/details_harness|winogrande|5_2023-09-22T20-03-30.331669.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T20-03-30.331669.parquet'
- config_name: results
data_files:
- split: 2023_08_09T17_57_53.688517
path:
- results_2023-08-09T17:57:53.688517.parquet
- split: 2023_09_22T20_03_30.331669
path:
- results_2023-09-22T20-03-30.331669.parquet
- split: latest
path:
- results_2023-09-22T20-03-30.331669.parquet
---
# Dataset Card for Evaluation run of jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b](https://huggingface.co/jordiclive/gpt4all-alpaca-oa-codealpaca-lora-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
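Since each run appears as a timestamp-named split, the split that "latest" aliases can be recovered by parsing those names. A minimal sketch (the `newest_split` helper is illustrative, not part of the leaderboard tooling; the split names come from this card's configs):

```python
from datetime import datetime

def newest_split(splits):
    # Drop the "latest" alias and keep only timestamp-named splits.
    stamped = [s for s in splits if s != "latest"]
    # Split names follow the pattern 2023_08_09T17_57_53.688517.
    return max(stamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

print(newest_split(["2023_08_09T17_57_53.688517", "2023_09_22T20_03_30.331669", "latest"]))
# → 2023_09_22T20_03_30.331669
```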
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jordiclive__gpt4all-alpaca-oa-codealpaca-lora-13b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-22T20:03:30.331669](https://huggingface.co/datasets/open-llm-leaderboard/details_jordiclive__gpt4all-alpaca-oa-codealpaca-lora-13b/blob/main/results_2023-09-22T20-03-30.331669.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.000456667646266702,
"f1": 0.05642302852349,
"f1_stderr": 0.0012977737732540458,
"acc": 0.41872834230806744,
"acc_stderr": 0.009633077195432445
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.000456667646266702,
"f1": 0.05642302852349,
"f1_stderr": 0.0012977737732540458
},
"harness|gsm8k|5": {
"acc": 0.0758150113722517,
"acc_stderr": 0.0072912057231625796
},
"harness|winogrande|5": {
"acc": 0.7616416732438832,
"acc_stderr": 0.011974948667702311
}
}
```
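The `*_stderr` fields above can be combined with the point estimates to get an approximate 95% confidence interval. A rough sketch using the winogrande numbers from this run (`ci95` is an illustrative helper, not part of the evaluation harness):

```python
def ci95(acc, stderr):
    # Normal-approximation 95% confidence interval: point estimate ± 1.96 * stderr.
    half = 1.96 * stderr
    return (acc - half, acc + half)

low, high = ci95(0.7616416732438832, 0.011974948667702311)
print(f"winogrande acc ~ [{low:.4f}, {high:.4f}]")
```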
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
satwikapaul/braille_dataset_2 | ---
license: openrail
---
|
DJFelipeBR/carlospalacio | ---
license: openrail
---
|
open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B-QLoRA | ---
pretty_name: Evaluation run of xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA](https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B-QLoRA\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T22:56:12.065154](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B-QLoRA/blob/main/results_2023-08-29T22%3A56%3A12.065154.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.438823528740988,\n\
\ \"acc_stderr\": 0.035260068155448576,\n \"acc_norm\": 0.44253606128507456,\n\
\ \"acc_norm_stderr\": 0.035246174415990414,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.4191863436208715,\n\
\ \"mc2_stderr\": 0.015793546690441883\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4880546075085324,\n \"acc_stderr\": 0.014607220340597171,\n\
\ \"acc_norm\": 0.5204778156996587,\n \"acc_norm_stderr\": 0.01459913135303501\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6022704640509858,\n\
\ \"acc_stderr\": 0.004884287515461491,\n \"acc_norm\": 0.788886675960964,\n\
\ \"acc_norm_stderr\": 0.004072645874992222\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.43018867924528303,\n \"acc_stderr\": 0.030471445867183235,\n\
\ \"acc_norm\": 0.43018867924528303,\n \"acc_norm_stderr\": 0.030471445867183235\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179964,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179964\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30687830687830686,\n \"acc_stderr\": 0.023752928712112147,\n \"\
acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.023752928712112147\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.47096774193548385,\n \"acc_stderr\": 0.028396016402761005,\n \"\
acc_norm\": 0.47096774193548385,\n \"acc_norm_stderr\": 0.028396016402761005\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3448275862068966,\n \"acc_stderr\": 0.033442837442804574,\n \"\
acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.033442837442804574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.03895658065271846,\n\
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.03895658065271846\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5404040404040404,\n \"acc_stderr\": 0.035507024651313425,\n \"\
acc_norm\": 0.5404040404040404,\n \"acc_norm_stderr\": 0.035507024651313425\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6632124352331606,\n \"acc_stderr\": 0.03410780251836184,\n\
\ \"acc_norm\": 0.6632124352331606,\n \"acc_norm_stderr\": 0.03410780251836184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4025641025641026,\n \"acc_stderr\": 0.024864995159767755,\n\
\ \"acc_norm\": 0.4025641025641026,\n \"acc_norm_stderr\": 0.024864995159767755\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823019,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823019\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36554621848739494,\n \"acc_stderr\": 0.0312821770636846,\n \
\ \"acc_norm\": 0.36554621848739494,\n \"acc_norm_stderr\": 0.0312821770636846\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6073394495412844,\n \"acc_stderr\": 0.020937505161201096,\n \"\
acc_norm\": 0.6073394495412844,\n \"acc_norm_stderr\": 0.020937505161201096\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012404,\n \"\
acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012404\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5098039215686274,\n \"acc_stderr\": 0.03508637358630572,\n \"\
acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.03508637358630572\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5021097046413502,\n \"acc_stderr\": 0.032546938018020076,\n \
\ \"acc_norm\": 0.5021097046413502,\n \"acc_norm_stderr\": 0.032546938018020076\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4798206278026906,\n\
\ \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.4798206278026906,\n\
\ \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4351145038167939,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.4351145038167939,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5785123966942148,\n \"acc_stderr\": 0.045077322787750874,\n \"\
acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.045077322787750874\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4049079754601227,\n \"acc_stderr\": 0.03856672163548913,\n\
\ \"acc_norm\": 0.4049079754601227,\n \"acc_norm_stderr\": 0.03856672163548913\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976235,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976235\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5145631067961165,\n \"acc_stderr\": 0.049486373240266356,\n\
\ \"acc_norm\": 0.5145631067961165,\n \"acc_norm_stderr\": 0.049486373240266356\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6367521367521367,\n\
\ \"acc_stderr\": 0.03150712523091264,\n \"acc_norm\": 0.6367521367521367,\n\
\ \"acc_norm_stderr\": 0.03150712523091264\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5696040868454662,\n\
\ \"acc_stderr\": 0.017705868776292398,\n \"acc_norm\": 0.5696040868454662,\n\
\ \"acc_norm_stderr\": 0.017705868776292398\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4682080924855491,\n \"acc_stderr\": 0.026864624366756646,\n\
\ \"acc_norm\": 0.4682080924855491,\n \"acc_norm_stderr\": 0.026864624366756646\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2659217877094972,\n\
\ \"acc_stderr\": 0.014776765066438883,\n \"acc_norm\": 0.2659217877094972,\n\
\ \"acc_norm_stderr\": 0.014776765066438883\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.028629305194003543,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.028629305194003543\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n\
\ \"acc_stderr\": 0.02804339985821063,\n \"acc_norm\": 0.5787781350482315,\n\
\ \"acc_norm_stderr\": 0.02804339985821063\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.027801656212323667,\n\
\ \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.027801656212323667\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3404255319148936,\n \"acc_stderr\": 0.02826765748265014,\n \
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.02826765748265014\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32073011734028684,\n\
\ \"acc_stderr\": 0.011921199991782643,\n \"acc_norm\": 0.32073011734028684,\n\
\ \"acc_norm_stderr\": 0.011921199991782643\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213528,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213528\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.39869281045751637,\n \"acc_stderr\": 0.019808281317449848,\n \
\ \"acc_norm\": 0.39869281045751637,\n \"acc_norm_stderr\": 0.019808281317449848\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4818181818181818,\n\
\ \"acc_stderr\": 0.04785964010794917,\n \"acc_norm\": 0.4818181818181818,\n\
\ \"acc_norm_stderr\": 0.04785964010794917\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40408163265306124,\n \"acc_stderr\": 0.0314147080258659,\n\
\ \"acc_norm\": 0.40408163265306124,\n \"acc_norm_stderr\": 0.0314147080258659\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5671641791044776,\n\
\ \"acc_stderr\": 0.03503490923673282,\n \"acc_norm\": 0.5671641791044776,\n\
\ \"acc_norm_stderr\": 0.03503490923673282\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n\
\ \"acc_stderr\": 0.03740059382029321,\n \"acc_norm\": 0.3614457831325301,\n\
\ \"acc_norm_stderr\": 0.03740059382029321\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n\
\ \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.4191863436208715,\n\
\ \"mc2_stderr\": 0.015793546690441883\n }\n}\n```"
repo_url: https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|arc:challenge|25_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hellaswag|10_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T22:56:12.065154.parquet'
- config_name: results
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- results_2023-08-29T22:56:12.065154.parquet
- split: latest
path:
- results_2023-08-29T22:56:12.065154.parquet
---
# Dataset Card for Evaluation run of xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA](https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B-QLoRA",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-29T22:56:12.065154](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B-QLoRA/blob/main/results_2023-08-29T22%3A56%3A12.065154.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.438823528740988,
"acc_stderr": 0.035260068155448576,
"acc_norm": 0.44253606128507456,
"acc_norm_stderr": 0.035246174415990414,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.4191863436208715,
"mc2_stderr": 0.015793546690441883
},
"harness|arc:challenge|25": {
"acc": 0.4880546075085324,
"acc_stderr": 0.014607220340597171,
"acc_norm": 0.5204778156996587,
"acc_norm_stderr": 0.01459913135303501
},
"harness|hellaswag|10": {
"acc": 0.6022704640509858,
"acc_stderr": 0.004884287515461491,
"acc_norm": 0.788886675960964,
"acc_norm_stderr": 0.004072645874992222
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.43018867924528303,
"acc_stderr": 0.030471445867183235,
"acc_norm": 0.43018867924528303,
"acc_norm_stderr": 0.030471445867183235
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179964,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179964
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3872340425531915,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.3872340425531915,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.023752928712112147,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.023752928712112147
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276864,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276864
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.47096774193548385,
"acc_stderr": 0.028396016402761005,
"acc_norm": 0.47096774193548385,
"acc_norm_stderr": 0.028396016402761005
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.033442837442804574,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.033442837442804574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.03895658065271846,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.03895658065271846
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5404040404040404,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.5404040404040404,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6632124352331606,
"acc_stderr": 0.03410780251836184,
"acc_norm": 0.6632124352331606,
"acc_norm_stderr": 0.03410780251836184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4025641025641026,
"acc_stderr": 0.024864995159767755,
"acc_norm": 0.4025641025641026,
"acc_norm_stderr": 0.024864995159767755
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823019,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823019
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36554621848739494,
"acc_stderr": 0.0312821770636846,
"acc_norm": 0.36554621848739494,
"acc_norm_stderr": 0.0312821770636846
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6073394495412844,
"acc_stderr": 0.020937505161201096,
"acc_norm": 0.6073394495412844,
"acc_norm_stderr": 0.020937505161201096
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.030225226160012404,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.030225226160012404
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.03508637358630572,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.03508637358630572
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5021097046413502,
"acc_stderr": 0.032546938018020076,
"acc_norm": 0.5021097046413502,
"acc_norm_stderr": 0.032546938018020076
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4798206278026906,
"acc_stderr": 0.033530461674123,
"acc_norm": 0.4798206278026906,
"acc_norm_stderr": 0.033530461674123
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4351145038167939,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.4351145038167939,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.045077322787750874,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.045077322787750874
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4049079754601227,
"acc_stderr": 0.03856672163548913,
"acc_norm": 0.4049079754601227,
"acc_norm_stderr": 0.03856672163548913
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976235,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976235
},
"harness|hendrycksTest-management|5": {
"acc": 0.5145631067961165,
"acc_stderr": 0.049486373240266356,
"acc_norm": 0.5145631067961165,
"acc_norm_stderr": 0.049486373240266356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6367521367521367,
"acc_stderr": 0.03150712523091264,
"acc_norm": 0.6367521367521367,
"acc_norm_stderr": 0.03150712523091264
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5696040868454662,
"acc_stderr": 0.017705868776292398,
"acc_norm": 0.5696040868454662,
"acc_norm_stderr": 0.017705868776292398
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.026864624366756646,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.026864624366756646
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2659217877094972,
"acc_stderr": 0.014776765066438883,
"acc_norm": 0.2659217877094972,
"acc_norm_stderr": 0.014776765066438883
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.028629305194003543,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.028629305194003543
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5787781350482315,
"acc_stderr": 0.02804339985821063,
"acc_norm": 0.5787781350482315,
"acc_norm_stderr": 0.02804339985821063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.027801656212323667,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.027801656212323667
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.02826765748265014,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.02826765748265014
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32073011734028684,
"acc_stderr": 0.011921199991782643,
"acc_norm": 0.32073011734028684,
"acc_norm_stderr": 0.011921199991782643
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.030343264224213528,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.030343264224213528
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.39869281045751637,
"acc_stderr": 0.019808281317449848,
"acc_norm": 0.39869281045751637,
"acc_norm_stderr": 0.019808281317449848
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4818181818181818,
"acc_stderr": 0.04785964010794917,
"acc_norm": 0.4818181818181818,
"acc_norm_stderr": 0.04785964010794917
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40408163265306124,
"acc_stderr": 0.0314147080258659,
"acc_norm": 0.40408163265306124,
"acc_norm_stderr": 0.0314147080258659
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5671641791044776,
"acc_stderr": 0.03503490923673282,
"acc_norm": 0.5671641791044776,
"acc_norm_stderr": 0.03503490923673282
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.03740059382029321,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.03740059382029321
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.4191863436208715,
"mc2_stderr": 0.015793546690441883
}
}
```
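As a quick illustration (a minimal sketch, not part of the original card), the per-task accuracies in a payload like the one above can be pulled out with plain Python. The `results` dict below is only a small excerpt of the full JSON, reproduced for the example:

```python
# Excerpt of the results payload shown above; the real JSON has one entry
# per evaluated task plus the aggregate "all" entry.
results = {
    "all": {"acc": 0.438823528740988, "acc_norm": 0.44253606128507456},
    "harness|arc:challenge|25": {"acc": 0.4880546075085324, "acc_norm": 0.5204778156996587},
    "harness|hellaswag|10": {"acc": 0.6022704640509858, "acc_norm": 0.788886675960964},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.6432748538011696, "acc_norm": 0.6432748538011696},
}

# Per-task accuracy, skipping the aggregate "all" entry.
per_task_acc = {
    task: metrics["acc"] for task, metrics in results.items() if task != "all"
}

for task, acc in sorted(per_task_acc.items()):
    print(f"{task}: {acc:.4f}")
```

The same pattern applies to `acc_norm`, `mc1`, or `mc2`, since every task entry is a flat metric-name-to-float mapping.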
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
EgilKarlsen/Thunderbird_RoBERTa_FT
---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115576722
num_examples: 37500
- name: test
num_bytes: 38525585
num_examples: 12500
download_size: 211881891
dataset_size: 154102307
---
# Dataset Card for "Thunderbird_RoBERTa_FT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuyButter/Forklift-Person-Dataset | ---
license: apache-2.0
---
|
andersonbcdefg/lm_instruction_pairs_v2_deduped_cf | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
splits:
- name: train
num_bytes: 770709174.7697415
num_examples: 664369
download_size: 185977674
dataset_size: 770709174.7697415
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_HiTZ__alpaca-lora-65b-en-pt-es-ca | ---
pretty_name: Evaluation run of HiTZ/alpaca-lora-65b-en-pt-es-ca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [HiTZ/alpaca-lora-65b-en-pt-es-ca](https://huggingface.co/HiTZ/alpaca-lora-65b-en-pt-es-ca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HiTZ__alpaca-lora-65b-en-pt-es-ca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T03:43:44.241616](https://huggingface.co/datasets/open-llm-leaderboard/details_HiTZ__alpaca-lora-65b-en-pt-es-ca/blob/main/results_2023-09-17T03-43-44.241616.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.44924496644295303,\n\
\ \"em_stderr\": 0.005094018275255409,\n \"f1\": 0.4984060402684574,\n\
\ \"f1_stderr\": 0.004892652635239537,\n \"acc\": 0.5359600711595986,\n\
\ \"acc_stderr\": 0.011658939983913114\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.44924496644295303,\n \"em_stderr\": 0.005094018275255409,\n\
\ \"f1\": 0.4984060402684574,\n \"f1_stderr\": 0.004892652635239537\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.266868840030326,\n \
\ \"acc_stderr\": 0.012183780551887955\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938275\n\
\ }\n}\n```"
repo_url: https://huggingface.co/HiTZ/alpaca-lora-65b-en-pt-es-ca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|arc:challenge|25_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T03_43_44.241616
path:
- '**/details_harness|drop|3_2023-09-17T03-43-44.241616.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T03-43-44.241616.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T03_43_44.241616
path:
- '**/details_harness|gsm8k|5_2023-09-17T03-43-44.241616.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T03-43-44.241616.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hellaswag|10_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-04T23:39:25.347647.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-04T23:39:25.347647.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-04T23:39:25.347647.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T03_43_44.241616
path:
- '**/details_harness|winogrande|5_2023-09-17T03-43-44.241616.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T03-43-44.241616.parquet'
- config_name: results
data_files:
- split: 2023_08_04T23_39_25.347647
path:
- results_2023-08-04T23:39:25.347647.parquet
- split: 2023_09_17T03_43_44.241616
path:
- results_2023-09-17T03-43-44.241616.parquet
- split: latest
path:
- results_2023-09-17T03-43-44.241616.parquet
---
# Dataset Card for Evaluation run of HiTZ/alpaca-lora-65b-en-pt-es-ca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/HiTZ/alpaca-lora-65b-en-pt-es-ca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [HiTZ/alpaca-lora-65b-en-pt-es-ca](https://huggingface.co/HiTZ/alpaca-lora-65b-en-pt-es-ca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HiTZ__alpaca-lora-65b-en-pt-es-ca",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T03:43:44.241616](https://huggingface.co/datasets/open-llm-leaderboard/details_HiTZ__alpaca-lora-65b-en-pt-es-ca/blob/main/results_2023-09-17T03-43-44.241616.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.44924496644295303,
"em_stderr": 0.005094018275255409,
"f1": 0.4984060402684574,
"f1_stderr": 0.004892652635239537,
"acc": 0.5359600711595986,
"acc_stderr": 0.011658939983913114
},
"harness|drop|3": {
"em": 0.44924496644295303,
"em_stderr": 0.005094018275255409,
"f1": 0.4984060402684574,
"f1_stderr": 0.004892652635239537
},
"harness|gsm8k|5": {
"acc": 0.266868840030326,
"acc_stderr": 0.012183780551887955
},
"harness|winogrande|5": {
"acc": 0.8050513022888713,
"acc_stderr": 0.011134099415938275
}
}
```
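For quick checks, individual metrics can be pulled straight out of a results dict of this shape; a minimal sketch, where the dict literal copies the values reported above and the `harness|<task>|<n-shot>` key convention is taken from the listing (the fixed 5-shot suffix here is an assumption that holds for these two tasks):

```python
# Sketch only: extract one task's metric from a leaderboard results dict.
# Values are copied from the reported run above.
results = {
    "all": {"acc": 0.5359600711595986, "acc_stderr": 0.011658939983913114},
    "harness|gsm8k|5": {"acc": 0.266868840030326, "acc_stderr": 0.012183780551887955},
    "harness|winogrande|5": {"acc": 0.8050513022888713, "acc_stderr": 0.011134099415938275},
}

def metric(task: str, name: str = "acc") -> float:
    """Return one metric for a harness task, assuming a 5-shot key suffix."""
    return results[f"harness|{task}|5"][name]

print(metric("winogrande"))  # 0.8050513022888713
```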
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hschang98/github-issues | ---
language:
- en
tags:
- code
size_categories:
- 1K<n<10K
---
# Dataset Card for github-issues
<!-- Provide a quick summary of the dataset. -->
This dataset is built from the GitHub issues of the Hugging Face Datasets repository.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
This dataset is built from the GitHub issues of the Hugging Face Datasets repository,
collected from https://github.com/huggingface/datasets/issues.
- **Language(s) (NLP):** en
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
squarelike/OpenOrca-gugugo-ko | ---
language:
- ko
license: mit
task_categories:
- conversational
- text-classification
- token-classification
- table-question-answering
- question-answering
- zero-shot-classification
- summarization
- feature-extraction
- text-generation
- text2text-generation
pretty_name: OpenOrca
size_categories:
- 10M<n<100M
---

# **OpenOrca Korean Translation Dataset**
[Gugugo-koen-7B-V1.1](https://huggingface.co/squarelike/Gugugo-koen-7B-V1.1) is being used to translate the [OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca) dataset into Korean.
Please see below for the current translation progress.
## Progress
- GPT-4 generations: about 640K of ~1M translated
- GPT-3.5 generations: about 1.59M of ~3.5M translated
Crediting the source when using this dataset is a great encouragement to its creator.
# Original dataset card: OpenOrca
## Table of Contents
- [Dataset Summary](#dataset-summary)
- [Dataset Attribution](#dataset-attribution)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Dataset Use](#dataset-use)
- [Use Cases](#use-cases)
- [Usage Caveats](#usage-caveats)
- [Getting Started](#getting-started)
<p><h1>🐋 The OpenOrca Dataset! 🐋</h1></p>

<a name="dataset-announcement"></a>
We are thrilled to announce the release of the OpenOrca dataset!
This rich collection of augmented FLAN data aligns, as best as possible, with the distributions outlined in the [Orca paper](https://arxiv.org/abs/2306.02707).
It has been instrumental in generating high-performing model checkpoints and serves as a valuable resource for all NLP researchers and developers!
# Official Models
## Mistral-7B-OpenOrca
Our [latest model](https://huggingface.co/spaces/Open-Orca/Mistral-7B-OpenOrca), the first 7B to score better overall than all previous models below 30B.
98% of Llama2-70b-chat's performance, in a completely open 7B!
## OpenOrca-Platypus2-13B
Our [third model](https://huggingface.co/Open-Orca/OpenOrca-Platypus2-13B), the first 13B model to score higher than LLaMA1-65B on the HuggingFace Leaderboard!
Released in partnership with Platypus.
## LlongOrca 7B & 13B
* Our [first 7B release](https://huggingface.co/Open-Orca/LlongOrca-7B-16k), trained on top of LLongMA2 to achieve 16,000 tokens context. #1 long context 7B model at release time, with >99% of the overall #1 model's performance.
* [LlongOrca-13B-16k](https://huggingface.co/Open-Orca/LlongOrca-13B-16k), trained on top of LLongMA2. #1 long context 13B model at release time, with >97% of the overall #1 model's performance.
## OpenOrcaxOpenChat-Preview2-13B
Our [second model](https://huggingface.co/Open-Orca/OpenOrcaxOpenChat-Preview2-13B), highlighting that we've surpassed the performance reported in the Orca paper.
Was #1 at release time, now surpassed by our own OpenOrca-Platypus2-13B.
Released in partnership with OpenChat.
## OpenOrca-Preview1-13B
[OpenOrca-Preview1-13B](https://huggingface.co/Open-Orca/OpenOrca-Preview1-13B)
This model was trained in less than a day, for <$200, with <10% of our data.
At release, it beat the current state of the art models on BigBench-Hard and AGIEval. Achieves ~60% of the improvements reported in the Orca paper.
<a name="dataset-summary"></a>
# Dataset Summary
The OpenOrca dataset is a collection of augmented [FLAN Collection data](https://arxiv.org/abs/2301.13688).
Currently ~1M GPT-4 completions, and ~3.2M GPT-3.5 completions.
It is tabularized in alignment with the distributions presented in the ORCA paper and currently represents a partial completion of the full intended dataset, with ongoing generation to expand its scope.
The data is primarily used for training and evaluation in the field of natural language processing.
<a name="dataset-attribution"></a>
# Dataset Attribution
We would like to give special recognition to the following contributors for their significant efforts and dedication:
Teknium
WingLian/Caseus
Eric Hartford
NanoBit
Pankaj
Winddude
Rohan
http://AlignmentLab.ai:
Autometa
Entropi
AtlasUnified
NeverendingToast
NanoBit
WingLian/Caseus
Also of course, as always, TheBloke, for being the backbone of the whole community.
Many thanks to NanoBit and Caseus, makers of [Axolotl](https://github.com/OpenAccess-AI-Collective/axolotl), for lending us their expertise on the platform that developed and trained manticore, minotaur, and many others!
We are welcoming sponsors or collaborators to help us build these models to the scale they deserve. Please reach out via our socials:
http://Alignmentlab.ai https://discord.gg/n9hXaBPWxx
Want to visualize our full dataset? Check out our [Nomic Atlas Map](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2).
[<img src="https://huggingface.co/Open-Orca/OpenOrca-Preview1-13B/resolve/main/OpenOrca%20Nomic%20Atlas.png" alt="Atlas Nomic Dataset Map" width="400" height="400" />](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2)
<a name="supported-tasks-and-leaderboards"></a>
# Supported Tasks and Leaderboards
This dataset supports a range of tasks including language modeling, text generation, and text augmentation.
It has been instrumental in the generation of multiple high-performing model checkpoints which have exhibited exceptional performance in our unit testing.
Further information on leaderboards will be updated as they become available.
<a name="languages"></a>
# Languages
The language of the data is primarily English.
<a name="dataset-structure"></a>
# Dataset Structure
<a name="data-instances"></a>
## Data Instances
A data instance in this dataset represents entries from the FLAN collection which have been augmented by submitting the listed question to either GPT-4 or GPT-3.5.
The response is then entered into the response field.
<a name="data-fields"></a>
## Data Fields
The fields are:
1) 'id', a unique numbered identifier which includes one of 'niv', 't0', 'cot', or 'flan' to represent which source FLAN Collection submix the 'question' is sourced from.
2) 'system_prompt', representing the System Prompt presented to the GPT-3.5 or GPT-4 API for the datapoint
3) 'question', representing a question entry as provided by the FLAN Collection
4) 'response', a response to that question received from a query to either GPT-3.5 or GPT-4.
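An illustrative sketch of a record with the four fields described above; the field values are invented placeholders, not taken from the dataset:

```python
# Hypothetical record with the documented fields (id, system_prompt,
# question, response). Values are made up for illustration.
example = {
    "id": "flan.123456",
    "system_prompt": "You are a helpful assistant.",
    "question": "What is the capital of France?",
    "response": "The capital of France is Paris.",
}

# The source FLAN submix ('niv', 't0', 'cot', or 'flan') is encoded
# in the id prefix.
submix = example["id"].split(".")[0]
print(submix)  # flan
```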
<a name="data-splits"></a>
## Data Splits
The data is unsplit.
<a name="dataset-creation"></a>
# Dataset Creation
<a name="curation-rationale"></a>
## Curation Rationale
The dataset was created to provide a source of augmented text data for researchers and developers.
The datapoints are intended primarily to provide an enhancement of the core FLAN Collection data which relies upon the detailed step by step reasoning capabilities of GPT-3.5 and GPT-4.
This "reasoning trace" augmentation has demonstrated exceptional results, allowing a LLaMA-13B model trained with this data to rival or beat GPT-3.5 on broad sets of hard reasoning tasks which all models below 100B parameters had previously performed dramatically worse on.
<a name="source-data"></a>
## Source Data
The data is generated using techniques in alignment with the distributions outlined in the Orca paper, except as noted below:
1) There is not enough CoT data in the FLAN Collection to generate 150K zero-shot entries, as the paper purports to use.
We suspect this portion was either undocumented or misrepresented. We have used the ~75K points available.
2) We used the pre-generated FLAN Collection datasets hosted on HuggingFace under conceptofmind, e.g. [conceptofmind/flan2021](https://huggingface.co/datasets/conceptofmind/flan2021_submix_original).
These are referenced by the [official FLAN Collection repo](https://github.com/google-research/FLAN/tree/main/flan/v2) as the preferred data source.
However, these are a subset of the full FLAN Collection data, and have less than the required entries for the flan2021 and t0 submixes, by ~1.25M and 200k respectively.
Combined, this gave us ~1.5M fewer datapoints than in the original Orca paper. Completing the set is an ongoing work.
<a name="dataset-use"></a>
# Dataset Use
<a name="use-cases"></a>
## Use Cases
The dataset can be used for tasks related to language understanding, natural language processing, machine learning model training, and model performance evaluation.
<a name="usage-caveats"></a>
## Usage Caveats
Given that this is a work-in-progress dataset, it is recommended to regularly check for updates and improvements.
Further, the data should be used in accordance with the guidelines and recommendations outlined in the Orca paper.
<a name="getting-started"></a>
## Getting Started
This dataset is organized such that it can be naively loaded via Hugging Face datasets library.
We recommend using streaming due to the large size of the files.
Regular updates and data generation progress can be monitored through the OpenOrca repository on Hugging Face.
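The recommended streaming access pattern can be sketched as follows; the `load_dataset` call shown in the comment requires network access to the Hub, so the runnable helper below demonstrates the same "take the first n" idiom on an arbitrary iterable:

```python
from itertools import islice

# With streaming=True, load_dataset returns an IterableDataset, so
# examples are fetched lazily instead of downloading full files:
#
#   from datasets import load_dataset
#   ds = load_dataset("Open-Orca/OpenOrca", split="train", streaming=True)
#   first_three = list(islice(ds, 3))
#
# The helper below captures the same pattern and works on any iterable,
# so it can be exercised without touching the Hub.
def take(n, iterable):
    """Return the first n items of an iterable as a list."""
    return list(islice(iterable, n))

print(take(3, range(10)))  # [0, 1, 2]
```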
# Citation
```bibtex
@misc{OpenOrca,
title = {OpenOrca: An Open Dataset of GPT Augmented FLAN Reasoning Traces},
author = {Wing Lian and Bleys Goodson and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
year = {2023},
publisher = {HuggingFace},
journal = {HuggingFace repository},
howpublished = {\url{https://huggingface.co/Open-Orca/OpenOrca}},
}
```
```bibtex
@misc{mukherjee2023orca,
title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
year={2023},
eprint={2306.02707},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@misc{longpre2023flan,
title={The Flan Collection: Designing Data and Methods for Effective Instruction Tuning},
author={Shayne Longpre and Le Hou and Tu Vu and Albert Webson and Hyung Won Chung and Yi Tay and Denny Zhou and Quoc V. Le and Barret Zoph and Jason Wei and Adam Roberts},
year={2023},
eprint={2301.13688},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
```bibtex
@misc{touvron2023llama,
title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
author={Hugo Touvron and Louis Martin and Kevin Stone and Peter Albert and Amjad Almahairi and Yasmine Babaei and Nikolay Bashlykov and Soumya Batra and Prajjwal Bhargava and Shruti Bhosale and Dan Bikel and Lukas Blecher and Cristian Canton Ferrer and Moya Chen and Guillem Cucurull and David Esiobu and Jude Fernandes and Jeremy Fu and Wenyin Fu and Brian Fuller and Cynthia Gao and Vedanuj Goswami and Naman Goyal and Anthony Hartshorn and Saghar Hosseini and Rui Hou and Hakan Inan and Marcin Kardas and Viktor Kerkez and Madian Khabsa and Isabel Kloumann and Artem Korenev and Punit Singh Koura and Marie-Anne Lachaux and Thibaut Lavril and Jenya Lee and Diana Liskovich and Yinghai Lu and Yuning Mao and Xavier Martinet and Todor Mihaylov and Pushkar Mishra and Igor Molybog and Yixin Nie and Andrew Poulton and Jeremy Reizenstein and Rashi Rungta and Kalyan Saladi and Alan Schelten and Ruan Silva and Eric Michael Smith and Ranjan Subramanian and Xiaoqing Ellen Tan and Binh Tang and Ross Taylor and Adina Williams and Jian Xiang Kuan and Puxin Xu and Zheng Yan and Iliyan Zarov and Yuchen Zhang and Angela Fan and Melanie Kambadur and Sharan Narang and Aurelien Rodriguez and Robert Stojnic and Sergey Edunov and Thomas Scialom},
year={2023},
eprint={2307.09288},
archivePrefix={arXiv}
}
@software{touvron2023llama,
title={LLaMA: Open and Efficient Foundation Language Models},
author={Touvron, Hugo and Lavril, Thibaut and Izacard, Gautier and Martinet, Xavier and Lachaux, Marie-Anne and Lacroix, Timoth{\'e}e and Rozi{\`e}re, Baptiste and Goyal, Naman and Hambro, Eric and Azhar, Faisal and Rodriguez, Aurelien and Joulin, Armand and Grave, Edouard and Lample, Guillaume},
journal={arXiv preprint arXiv:2302.13971},
year={2023}
}
``` |
Nooon/Donate_a_cry | ---
license: mit
---
|
kuiugh/newbingto | ---
license: mit
---
|
GIZ/policy_classification | ---
configs:
- config_name: default
data_files:
- split: train
path: "policy_classification_train.json"
- split: test
path: "policy_classification_test.json"
--- |
David-Xu/raw_datasets_dolly | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 8266247
num_examples: 9489
- name: test
num_bytes: 901382
num_examples: 1055
download_size: 5779876
dataset_size: 9167629
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
yuan-sf63/word_mask_D_32 | ---
dataset_info:
features:
- name: feature
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 13882115.11177655
num_examples: 141711
- name: validation
num_bytes: 1542489.8882234516
num_examples: 15746
download_size: 11544184
dataset_size: 15424605.0
---
# Dataset Card for "word_mask_D_32"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v0.2-sp-v0 | ---
pretty_name: Evaluation run of azarafrooz/Mistral-7B-Instruct-v0.2-sp-v0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [azarafrooz/Mistral-7B-Instruct-v0.2-sp-v0](https://huggingface.co/azarafrooz/Mistral-7B-Instruct-v0.2-sp-v0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v0.2-sp-v0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T19:32:48.242732](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v0.2-sp-v0/blob/main/results_2024-03-09T19-32-48.242732.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.607447095635537,\n\
\ \"acc_stderr\": 0.03314052014839398,\n \"acc_norm\": 0.6119347527420224,\n\
\ \"acc_norm_stderr\": 0.033811338894945774,\n \"mc1\": 0.5287637698898409,\n\
\ \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6822484423368418,\n\
\ \"mc2_stderr\": 0.015197767693951841\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522085,\n\
\ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491888\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6681935869348735,\n\
\ \"acc_stderr\": 0.004698995789478832,\n \"acc_norm\": 0.8484365664210317,\n\
\ \"acc_norm_stderr\": 0.003578643387547847\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.027379871229943245,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.027379871229943245\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306443,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5564102564102564,\n \"acc_stderr\": 0.025189149894764205,\n\
\ \"acc_norm\": 0.5564102564102564,\n \"acc_norm_stderr\": 0.025189149894764205\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881563,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881563\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7908256880733945,\n \"acc_stderr\": 0.017437937173343233,\n \"\
acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.017437937173343233\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145624,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145624\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.014805384478371155,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.014805384478371155\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31620111731843575,\n\
\ \"acc_stderr\": 0.015551673652172547,\n \"acc_norm\": 0.31620111731843575,\n\
\ \"acc_norm_stderr\": 0.015551673652172547\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.02548311560119546,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.02548311560119546\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n\
\ \"acc_stderr\": 0.012654565234622868,\n \"acc_norm\": 0.43285528031290743,\n\
\ \"acc_norm_stderr\": 0.012654565234622868\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n\
\ \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6339869281045751,\n \"acc_stderr\": 0.019488025745529675,\n \
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.019488025745529675\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.03115715086935557,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.03115715086935557\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5287637698898409,\n\
\ \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6822484423368418,\n\
\ \"mc2_stderr\": 0.015197767693951841\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.01180736022402539\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40106141015921154,\n \
\ \"acc_stderr\": 0.013500158922245542\n }\n}\n```"
repo_url: https://huggingface.co/azarafrooz/Mistral-7B-Instruct-v0.2-sp-v0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|arc:challenge|25_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|gsm8k|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hellaswag|10_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-32-48.242732.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T19-32-48.242732.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- '**/details_harness|winogrande|5_2024-03-09T19-32-48.242732.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T19-32-48.242732.parquet'
- config_name: results
data_files:
- split: 2024_03_09T19_32_48.242732
path:
- results_2024-03-09T19-32-48.242732.parquet
- split: latest
path:
- results_2024-03-09T19-32-48.242732.parquet
---
# Dataset Card for Evaluation run of azarafrooz/Mistral-7B-Instruct-v0.2-sp-v0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [azarafrooz/Mistral-7B-Instruct-v0.2-sp-v0](https://huggingface.co/azarafrooz/Mistral-7B-Instruct-v0.2-sp-v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v0.2-sp-v0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-09T19:32:48.242732](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v0.2-sp-v0/blob/main/results_2024-03-09T19-32-48.242732.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.607447095635537,
"acc_stderr": 0.03314052014839398,
"acc_norm": 0.6119347527420224,
"acc_norm_stderr": 0.033811338894945774,
"mc1": 0.5287637698898409,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.6822484423368418,
"mc2_stderr": 0.015197767693951841
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522085,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.014104578366491888
},
"harness|hellaswag|10": {
"acc": 0.6681935869348735,
"acc_stderr": 0.004698995789478832,
"acc_norm": 0.8484365664210317,
"acc_norm_stderr": 0.003578643387547847
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.027379871229943245,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.027379871229943245
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306443,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5564102564102564,
"acc_stderr": 0.025189149894764205,
"acc_norm": 0.5564102564102564,
"acc_norm_stderr": 0.025189149894764205
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881563,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881563
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.017437937173343233,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.017437937173343233
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145624,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371155,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371155
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31620111731843575,
"acc_stderr": 0.015551673652172547,
"acc_norm": 0.31620111731843575,
"acc_norm_stderr": 0.015551673652172547
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.02671611838015685,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.02671611838015685
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.02548311560119546,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.02548311560119546
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622868,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622868
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.019488025745529675,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.019488025745529675
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935557,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935557
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333047,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333047
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5287637698898409,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.6822484423368418,
"mc2_stderr": 0.015197767693951841
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.01180736022402539
},
"harness|gsm8k|5": {
"acc": 0.40106141015921154,
"acc_stderr": 0.013500158922245542
}
}
```
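Once loaded, these per-task scores can be aggregated locally. As a minimal sketch, here is a macro-average over a hand-copied subset of the values above (the leaderboard's reported MMLU average uses all 57 `hendrycksTest` tasks, not just these three):

```python
# A small subset of the per-task scores shown above, copied by hand
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5777777777777777},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.625},
    "harness|winogrande|5": {"acc": 0.771112865035517},
}

# Macro-average accuracy over the MMLU (hendrycksTest) tasks only
mmlu_scores = [v["acc"] for k, v in results.items()
               if k.startswith("harness|hendrycksTest-")]
mmlu_acc = sum(mmlu_scores) / len(mmlu_scores)
print(round(mmlu_acc, 4))
```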
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
anan-2024/twitter_dataset_1713187666 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 22133
num_examples: 48
download_size: 13275
dataset_size: 22133
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Dahoas/static-hh | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 143664651
num_examples: 96256
- name: test
num_bytes: 7649255
num_examples: 5103
download_size: 90825631
dataset_size: 151313906
---
Static split of Anthropic's Helpful Harmless dataset. Contains base-online and rejection-sampled outputs. |
prerna7/resume-dataset | ---
task_categories:
- text-classification
- token-classification
language:
- en
size_categories:
- 1K<n<10K
license: openrail
--- |
FredZhang7/malicious-website-features-2.4M | ---
license: apache-2.0
task_categories:
- text-classification
- feature-extraction
- tabular-classification
language:
- 'no'
- af
- en
- et
- sw
- sv
- sq
- de
- ca
- hu
- da
- tl
- so
- fi
- fr
- cs
- hr
- cy
- es
- sl
- tr
- pl
- pt
- nl
- id
- sk
- lt
- lv
- vi
- it
- ro
- ru
- mk
- bg
- th
- ja
- ko
- multilingual
size_categories:
- 1M<n<10M
---
**Important Notice:**
- A subset of the URL dataset is from Kaggle, and the Kaggle datasets contained 10%-15% mislabelled data. See [this discussion I opened](https://www.kaggle.com/datasets/sid321axn/malicious-urls-dataset/discussion/431505) for some false positives. I have contacted Kaggle regarding their erroneous "Usability" score calculation for these unreliable datasets.
- The feature extraction methods shown here are not robust at all in 2023, and there are even silly mistakes in 3 functions: `not_indexed_by_google`, `domain_registration_length`, and `age_of_domain`.
<br>
The *features* dataset is original, and my feature extraction method is covered in [feature_extraction.py](./feature_extraction.py).
To extract features from a website, simply pass the URL and label to `collect_data()`. The features are saved to `phishing_detection_dataset.csv` locally by default.
In the *features* dataset, there are 911,180 websites that were online at the time of data collection. The plots below show the regression line and correlation coefficients between the 22+ extracted features and whether the URL is malicious.
If we plotted the lifespan of the URLs, we would see that the oldest website has been online since Nov 7th, 2008, while the most recent phishing websites appeared as late as July 10th, 2023.
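As a rough illustration of the kind of lexical features such an extractor computes — this simplified function is my own sketch, not part of `feature_extraction.py`:

```python
from urllib.parse import urlparse

def extract_lexical_features(url: str) -> dict:
    """A few simple lexical URL features. The real extractor in
    feature_extraction.py also inspects DNS records, certificates,
    page content, and more."""
    parsed = urlparse(url)
    host = parsed.netloc
    return {
        "url_length": len(url),
        "num_digits": sum(c.isdigit() for c in url),
        "num_dots_in_host": host.count("."),
        "uses_https": int(parsed.scheme == "https"),
        "has_at_symbol": int("@" in url),
    }

feats = extract_lexical_features("http://login-paypa1.example.com/verify?id=123")
```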
## Malicious URL Categories
- Defacement
- Malware
- Phishing
## Data Analysis
Here are two images showing the correlation coefficient and coefficient of determination between the predictor values and the target value `is_malicious`.


Let's examine the correlations one by one and cross out any unreasonable or insignificant ones.
| Variable | Justification for Crossing Out |
|-----------------------------|------------------------------------- |
| ~~redirects~~ | contradicts previous research (as redirects increase, is_malicious tends to decrease slightly) |
| ~~not_indexed_by_google~~ | 0.00 correlation |
| ~~email_submission~~ | contradicts previous research |
| request_url_percentage | |
| issuer | |
| certificate_age | |
| ~~url_anchor_percentage~~ | contradicts previous research |
| ~~meta_percentage~~ | 0.00 correlation |
| script_percentage | |
| link_percentage | |
| ~~mouseover_changes~~ | contradicts previous research & 0.00 correlation |
| ~~right_clicked_disabled~~ | contradicts previous research & 0.00 correlation |
| ~~popup_window_has_text_field~~ | contradicts previous research |
| ~~use_iframe~~ | contradicts previous research |
| ~~has_suspicious_ports~~ | contradicts previous research |
| ~~external_favicons~~ | contradicts previous research |
| TTL (Time to Live) | |
| ip_address_count | |
| ~~TXT_record~~ | all websites had a TXT record |
| ~~check_sfh~~ | contradicts previous research |
| count_domain_occurrences | |
| domain_registration_length | |
| abnormal_url | |
| age_of_domain | |
| page_rank_decimal | |
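The coefficients in the images above are plain Pearson correlations; a minimal sketch on toy data (not the real dataset) shows how a feature column would be scored against the binary target:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy example: a feature column vs. the binary is_malicious target
page_rank = [8, 7, 6, 2, 1, 1]
is_malicious = [0, 0, 0, 1, 1, 1]
r = pearson_r(page_rank, is_malicious)
r_squared = r ** 2  # coefficient of determination
```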
## Pre-training Ideas
For training, I split the classification task into two stages in anticipation of the limited availability of online phishing websites due to their short lifespan, as well as the possibility that research done on phishing is not up-to-date:
1. a small multilingual BERT model to output the confidence level of a URL being malicious to model #2, by finetuning on 2,436,727 legitimate and malicious URLs
2. (probably) LightGBM to analyze the confidence level, along with roughly 10 extracted features
This way, I can make the most out of the limited phishing websites available.
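Stage 2 could be as simple as concatenating the stage-1 confidence with the extracted tabular features before handing them to the gradient-boosted model. A hypothetical sketch — the function and feature names here are illustrative, not from the repository:

```python
def build_stage2_input(url_confidence: float, features: dict) -> list:
    """Concatenate the stage-1 BERT confidence with tabular features
    in a fixed column order, since LightGBM expects a flat feature vector
    with consistent column alignment across rows."""
    column_order = sorted(features)  # fixed ordering keeps columns aligned
    return [url_confidence] + [float(features[k]) for k in column_order]

row = build_stage2_input(0.87, {"certificate_age": 120, "page_rank_decimal": 3.2})
```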
## Source of the URLs
- https://moz.com/top500
- https://phishtank.org/phish_search.php?valid=y&active=y&Search=Search
- https://www.kaggle.com/datasets/siddharthkumar25/malicious-and-benign-urls
- https://www.kaggle.com/datasets/sid321axn/malicious-urls-dataset
- https://github.com/ESDAUNG/PhishDataset
- https://github.com/JPCERTCC/phishurl-list
- https://github.com/Dogino/Discord-Phishing-URLs
## Reference
- https://www.kaggle.com/datasets/akashkr/phishing-website-dataset
- https://www.kaggle.com/datasets/shashwatwork/web-page-phishing-detection-dataset
- https://www.kaggle.com/datasets/aman9d/phishing-data
## Side notes
- Cloudflare offers an [API for phishing URL scanning](https://developers.cloudflare.com/api/operations/phishing-url-information-get-results-for-a-url-scan), with a generous global rate limit of 1200 requests every 5 minutes. |
skeskinen/books3_basic_paragraphs | ---
dataset_info:
features:
- name: text
dtype: string
- name: book
dtype: string
- name: pos
dtype: float64
- name: smog_index
dtype: float64
splits:
- name: train
num_bytes: 1366299770
num_examples: 6639751
download_size: 676098743
dataset_size: 1366299770
---
# Dataset Card for "books3_basic_paragraphs"
Books from the_pile's books3 subset with a SMOG grade difficulty estimate of 6.5 or under. The books are split into paragraphs, with most 'non-paragraphs' (titles, tables of contents, etc.) filtered out. |
CorpuSlave/KoEn | ---
license: cc-by-nc-sa-4.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: doc_id
dtype: string
splits:
- name: train
num_bytes: 6028010433
num_examples: 32131380
download_size: 2633226968
dataset_size: 6028010433
---
|
zcahjl3/gsm8k_optimize_examples | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: CoT_example
dtype: string
- name: rationale
dtype: string
- name: rationale_embedding
sequence: float32
- name: answer_embedding
sequence: float32
- name: final_answer
dtype: string
- name: question_embedding
sequence: float32
splits:
- name: train
num_bytes: 78180937.31781079
num_examples: 7376
download_size: 80446307
dataset_size: 78180937.31781079
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Gbssreejith/newdataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 41097936.0
num_examples: 182
- name: validation
num_bytes: 4950183.0
num_examples: 21
download_size: 43803362
dataset_size: 46048119.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
TrainingDataPro/cars-video-object-tracking | ---
license: cc-by-nc-nd-4.0
task_categories:
- image-segmentation
- image-classification
language:
- en
tags:
- code
dataset_info:
features:
- name: image_id
dtype: int32
- name: image
dtype: image
- name: mask
dtype: image
- name: annotations
dtype: string
splits:
- name: train
num_bytes: 614230158
num_examples: 100
download_size: 580108296
dataset_size: 614230158
---
# Cars Tracking
A collection of overhead video frames capturing various types of vehicles traversing a roadway. The dataset includes light vehicles (cars) and heavy vehicles (minivans).
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/object-tracking?utm_source=huggingface&utm_medium=cpc&utm_campaign=cars-video-object-tracking) to discuss your requirements, learn about the price and buy the dataset.

# Data Format
Each video frame from `images` folder is paired with an `annotations.xml` file that meticulously defines the tracking of each vehicle using polygons.
These annotations not only specify the location and path of each vehicle but also differentiate between the vehicle classes:
- cars,
- minivans.
The data labeling is visualized in the `boxes` folder.
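As an illustration, polygon tracks like these can be parsed with Python's standard library. The XML layout below is a hypothetical CVAT-style sketch, not the exact schema of this dataset's `annotations.xml`:

```python
import xml.etree.ElementTree as ET

# Hypothetical CVAT-style annotation snippet; the real annotations.xml
# schema may differ in attribute names and nesting.
SAMPLE = """
<annotations>
  <track id="0" label="car">
    <polygon frame="0" points="10.0,20.0;30.0,20.0;30.0,40.0;10.0,40.0"/>
  </track>
  <track id="1" label="minivan">
    <polygon frame="0" points="50.0,60.0;70.0,60.0;70.0,80.0;50.0,80.0"/>
  </track>
</annotations>
"""

def parse_tracks(xml_text):
    """Return a list of (label, frame, [(x, y), ...]) tuples."""
    root = ET.fromstring(xml_text)
    tracks = []
    for track in root.iter("track"):
        label = track.get("label")
        for poly in track.iter("polygon"):
            frame = int(poly.get("frame"))
            points = [tuple(map(float, p.split(",")))
                      for p in poly.get("points").split(";")]
            tracks.append((label, frame, points))
    return tracks

tracks = parse_tracks(SAMPLE)
```

Each tuple pairs a vehicle class with the polygon outlining it in a given frame, which is the information needed to reconstruct a track across frames.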
# Example of the XML-file

# Object tracking is made in accordance with your requirements.
## **[TrainingData](https://trainingdata.pro/data-market/object-tracking?utm_source=huggingface&utm_medium=cpc&utm_campaign=cars-video-object-tracking)** provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
autoevaluate/autoeval-staging-eval-project-xsum-9818ea4b-12975766 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: sshleifer/distilbart-xsum-12-6
metrics: []
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: sshleifer/distilbart-xsum-12-6
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@grapplerulrich](https://huggingface.co/grapplerulrich) for evaluating this model. |
DanilFeofilov/Feofilov2.0 | ---
license: unknown
---
|
osunlp/KBQA-Agent | ---
license: cc-by-4.0
task_categories:
- question-answering
language:
- en
size_categories:
- n<1K
---
**Introduction**
In traditional knowledge base question answering (KBQA) methods, semantic parsing plays a crucial role. It requires a semantic parser to be extensively trained on a vast dataset of labeled examples, typically consisting of question-answer or question-program pairs.
However, the rise of LLMs has shifted this paradigm. LLMs excel in learning from few (or even zero) in-context examples. They utilize natural language as a general vehicle of thought, enabling them to actively navigate and interact with KBs using auxiliary tools, without the need for training on comprehensive datasets. This advance suggests LLMs can sidestep the earlier limitations and eliminate the dependency on extensive, high-coverage training data.
Such a paradigm is usually encapsulated in the term "language agent" or "LLM agent". Existing KBQA datasets may not be ideal for evaluating this new paradigm for two reasons: 1) many questions are single-hop queries over the KB, which fail to sufficiently challenge the capabilities of LLMs, and 2) established KBQA benchmarks contain tens of thousands of test questions; evaluating the most capable models like GPT-4 on so many questions would be extremely costly and often unnecessary.
As a result, we curate KBQA-Agent to offer a more targeted KBQA evaluation for language agents. KBQA-Agent contains 500 complex questions over Freebase from three existing KBQA datasets: GrailQA, ComplexWebQuestions, and GraphQuestions. To further support future research, we also provide the ground truth action sequence (i.e., tool invocations) for the language agent to take to answer each question.
**Split**
KBQA-Agent targets a training-free setting (we used a one-shot demo in our original experiments), so only a single test split is provided.
**Dataset Structure**
- **qid:** The unique id of a question
- **s-expression:** The ground truth logical form, where we derive the ground truth actions from
- **answer:** The list of answer entities
- **question:** The input question
- **actions:** The ground truth sequence of actions, derived from the s-expression
- **entities:** The topic entities mentioned in the question
- **source:** The source of the question (e.g., GrailQA)
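As a sketch of how the fields above fit together, the snippet below iterates over mock records with this schema, tallying questions per source and the average length of the ground-truth action sequence. All field values here are invented for illustration; the real data should be loaded from the Hugging Face Hub (e.g., via `datasets.load_dataset`).

```python
from collections import Counter

# Mock records following the field schema described above;
# qids, questions, entity ids, and actions are all invented.
records = [
    {"qid": "q1", "question": "which river flows through the most countries?",
     "answer": ["m.0abc"], "entities": ["m.0xyz"],
     "actions": ["get_relations", "get_neighbors", "argmax"],
     "source": "GrailQA"},
    {"qid": "q2", "question": "what films did the director of that film write?",
     "answer": ["m.0def"], "entities": ["m.0uvw"],
     "actions": ["get_relations", "get_neighbors"],
     "source": "ComplexWebQuestions"},
]

by_source = Counter(r["source"] for r in records)
avg_actions = sum(len(r["actions"]) for r in records) / len(records)
```

The `actions` list is what a language agent is evaluated against: each entry corresponds to one tool invocation derived from the gold s-expression.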
**Citation**
If our paper or related resources prove valuable to your research, we kindly ask for citation. Please feel free to contact us with any inquiries.
```
@article{Gu2024Middleware,
  author = {Yu Gu and Yiheng Shu and Hao Yu and Xiao Liu and Yuxiao Dong and Jie Tang and Jayanth Srinivasa and Hugo Latapie and Yu Su},
  title = {Middleware for LLMs: Tools Are Instrumental for Language Agents in Complex Environments},
  journal = {arXiv preprint arXiv:2402.14672},
year = {2024}
}
```
Please also cite the original sources of KBQA-Agent:
**GrailQA:**
```
@inproceedings{grailqa,
  author = {Yu Gu and Sue Kase and Michelle Vanni and Brian M. Sadler and Percy Liang and Xifeng Yan and Yu Su},
title = {Beyond {I.I.D.:} Three Levels of Generalization for Question Answering on Knowledge Bases},
booktitle = {WWW '21: The Web Conference 2021, Virtual Event / Ljubljana, Slovenia, April 19-23, 2021},
year = {2021}
}
```
**ComplexWebQ:**
```
@inproceedings{cwq,
  author = {Alon Talmor and Jonathan Berant},
title = {The Web as a Knowledge-Base for Answering Complex Questions},
booktitle = {Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT 2018, New Orleans, Louisiana, USA, June 1-6, 2018, Volume 1 (Long Papers)},
year = {2018}
}
```
**GraphQuestions:**
```
@inproceedings{graphq,
  author = {Yu Su and Huan Sun and Brian M. Sadler and Mudhakar Srivatsa and Izzeddin Gur and Zenghui Yan and Xifeng Yan},
title = {On Generating Characteristic-rich Question Sets for QA Evaluation},
booktitle = {Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, EMNLP 2016, Austin, Texas, USA, November 1-4, 2016},
year = {2016}
}
```
|
Tverous/claim2 | ---
dataset_info:
features:
- name: uid
dtype: string
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: reason
dtype: string
- name: manipulated_claim_cleaned_amr
dtype: string
- name: pairID
dtype: string
- name: amr_penman
dtype: string
- name: amr_tokens
sequence: string
- name: amr_nodes
dtype: string
- name: amr_alignments
dtype: string
- name: amr_edges
sequence:
sequence: string
- name: fg_label
dtype: string
splits:
- name: split1
num_bytes: 53391
num_examples: 30
download_size: 46091
dataset_size: 53391
---
# Dataset Card for "claim2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
316usman/thematic5aembed | ---
dataset_info:
features:
- name: text
dtype: string
- name: thematic
dtype: string
- name: sub-thematic
dtype: string
- name: country
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 157542836
num_examples: 201924
download_size: 46333907
dataset_size: 157542836
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kaxap/pg-wikiSQL-sql-instructions-80k | ---
license: bsd-3-clause
---
A converted, cleaned, and syntax-checked version of the [WikiSQL](https://github.com/salesforce/WikiSQL/) dataset.
Data points containing non-Latin column names were removed.
The resulting SQL statements were adapted to PostgreSQL syntax and conventions.
Each SQL statement, including the `CREATE TABLE` statements, was syntax-checked with [pgsanity](https://github.com/markdrago/pgsanity).
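A column-name filter like the one described above could be sketched as follows; the exact character set accepted during this dataset's cleaning is an assumption:

```python
import re

# Accept column names made of ASCII letters, digits, underscores, spaces,
# and a few common punctuation characters; reject anything non-Latin.
# The allowed punctuation set is a guess, not the dataset's actual rule.
LATIN_COLUMN = re.compile(r"^[A-Za-z0-9_ ()\-./%]+$")

def has_only_latin_columns(columns):
    """True if every column name matches the Latin-only pattern."""
    return all(LATIN_COLUMN.match(c) for c in columns)

has_only_latin_columns(["player", "goals_scored"])  # keeps the table
has_only_latin_columns(["選手", "goals_scored"])     # drops the table
```

Tables failing the check would be dropped wholesale, since a single untranslatable column name makes the generated `CREATE TABLE` statement unusable downstream.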
# Citations
```
@article{zhongSeq2SQL2017,
author = {Victor Zhong and
Caiming Xiong and
Richard Socher},
title = {Seq2SQL: Generating Structured Queries from Natural Language using
Reinforcement Learning},
journal = {CoRR},
volume = {abs/1709.00103},
year = {2017}
}
``` |
irds/beir_hotpotqa | ---
pretty_name: '`beir/hotpotqa`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `beir/hotpotqa`
The `beir/hotpotqa` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/beir#beir/hotpotqa).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=5,233,329
- `queries` (i.e., topics); count=97,852
This dataset is used by: [`beir_hotpotqa_dev`](https://huggingface.co/datasets/irds/beir_hotpotqa_dev), [`beir_hotpotqa_test`](https://huggingface.co/datasets/irds/beir_hotpotqa_test), [`beir_hotpotqa_train`](https://huggingface.co/datasets/irds/beir_hotpotqa_train)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/beir_hotpotqa', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ..., 'title': ..., 'url': ...}
queries = load_dataset('irds/beir_hotpotqa', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Yang2018Hotpotqa,
title = "{H}otpot{QA}: A Dataset for Diverse, Explainable Multi-hop Question Answering",
author = "Yang, Zhilin and
Qi, Peng and
Zhang, Saizheng and
Bengio, Yoshua and
Cohen, William and
Salakhutdinov, Ruslan and
Manning, Christopher D.",
booktitle = "Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing",
month = oct # "-" # nov,
year = "2018",
address = "Brussels, Belgium",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/D18-1259",
doi = "10.18653/v1/D18-1259",
pages = "2369--2380"
}
@article{Thakur2021Beir,
title = "BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models",
author = "Thakur, Nandan and Reimers, Nils and Rücklé, Andreas and Srivastava, Abhishek and Gurevych, Iryna",
journal= "arXiv preprint arXiv:2104.08663",
month = "4",
year = "2021",
url = "https://arxiv.org/abs/2104.08663",
}
```
|
SasnayaLetovka/rep_name | ---
dataset_info:
features:
- name: image
dtype: int64
- name: input_ids
sequence: int64
- name: attention_mask
sequence: int64
- name: token_type_ids
sequence: int64
- name: bbox
dtype: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 49248
num_examples: 3
- name: test
num_bytes: 49248
num_examples: 3
- name: val
num_bytes: 49248
num_examples: 3
download_size: 18994
dataset_size: 147744
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
---
|
ittailup/issste-gender | ---
dataset_info:
features:
- name: full_name
dtype: string
- name: sexo
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 119030272
num_examples: 2795585
download_size: 69319093
dataset_size: 119030272
---
# Dataset Card for "issste"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_wnli_non_coordinated_obj_subj | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 9342
num_examples: 48
- name: test
num_bytes: 27092
num_examples: 95
- name: train
num_bytes: 80278
num_examples: 429
download_size: 45052
dataset_size: 116712
---
# Dataset Card for "MULTI_VALUE_wnli_non_coordinated_obj_subj"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cassanof/gazzetta-self-instruct-subset1000 | ---
dataset_info:
features:
- name: text
dtype: string
- name: field1
dtype: string
- name: field2
dtype: string
- name: eiv
dtype: string
- name: about
dtype: string
- name: url
dtype: string
- name: date
dtype: string
- name: self_instructed
dtype: string
- name: riassunto
dtype: string
splits:
- name: train
num_bytes: 4623120
num_examples: 1000
download_size: 2314455
dataset_size: 4623120
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tilemachos/health_summarizetldr | ---
license: unknown
---
|
CyberHarem/ryuzaki_kaoru_theidolmastercinderellagirlsu149 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Ryūzaki Kaoru
This is the dataset of Ryūzaki Kaoru, containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 436 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 436 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 436 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 436 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Ammok/laptop_price_prediction | ---
license: apache-2.0
task_categories:
- tabular-regression
language:
- en
pretty_name: laptop price prediction
size_categories:
- 1K<n<10K
--- |
phanvancongthanh/data_part04 | ---
dataset_info:
features:
- name: smiles
dtype: string
splits:
- name: train
num_bytes: 4857438660
num_examples: 103117853
download_size: 2376530922
dataset_size: 4857438660
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data_part04"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rinabuoy/Khmer-ALT-Flores-GTran-SSBIC-2Ways-Mistral-V3 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 62332640
num_examples: 150584
- name: test
num_bytes: 5474498
num_examples: 11822
download_size: 16000295
dataset_size: 67807138
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
jdreetz/medicare-faq | ---
license: unknown
---
|
freshpearYoon/vr_train_free_40 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 6530882103
num_examples: 10000
download_size: 977547378
dataset_size: 6530882103
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_chanwit__flux-base-optimized | ---
pretty_name: Evaluation run of chanwit/flux-base-optimized
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chanwit/flux-base-optimized](https://huggingface.co/chanwit/flux-base-optimized)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chanwit__flux-base-optimized\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-11T23:31:14.212913](https://huggingface.co/datasets/open-llm-leaderboard/details_chanwit__flux-base-optimized/blob/main/results_2024-02-11T23-31-14.212913.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5981649622618641,\n\
\ \"acc_stderr\": 0.0333006317784589,\n \"acc_norm\": 0.6020819933365092,\n\
\ \"acc_norm_stderr\": 0.033975467298082776,\n \"mc1\": 0.34516523867809057,\n\
\ \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.5001790307121097,\n\
\ \"mc2_stderr\": 0.015267929934854846\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.60580204778157,\n \"acc_stderr\": 0.01428052266746732,\n\
\ \"acc_norm\": 0.6544368600682594,\n \"acc_norm_stderr\": 0.013896938461145677\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6072495518820952,\n\
\ \"acc_stderr\": 0.004873640184773443,\n \"acc_norm\": 0.8173670583549094,\n\
\ \"acc_norm_stderr\": 0.0038557568514415463\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720685,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720685\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138215,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138215\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6161290322580645,\n\
\ \"acc_stderr\": 0.027666182075539635,\n \"acc_norm\": 0.6161290322580645,\n\
\ \"acc_norm_stderr\": 0.027666182075539635\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n\
\ \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871927,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871927\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217902,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217902\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489294,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489294\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n\
\ \"acc_stderr\": 0.014248873549217583,\n \"acc_norm\": 0.8020434227330779,\n\
\ \"acc_norm_stderr\": 0.014248873549217583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n\
\ \"acc_stderr\": 0.015382845587584506,\n \"acc_norm\": 0.3039106145251397,\n\
\ \"acc_norm_stderr\": 0.015382845587584506\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.027264297599804012,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.027264297599804012\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44002607561929596,\n\
\ \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.44002607561929596,\n\
\ \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.029896163033125474,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.029896163033125474\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854125,\n \
\ \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854125\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n\
\ \"acc_stderr\": 0.03265819588512699,\n \"acc_norm\": 0.6915422885572139,\n\
\ \"acc_norm_stderr\": 0.03265819588512699\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34516523867809057,\n\
\ \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.5001790307121097,\n\
\ \"mc2_stderr\": 0.015267929934854846\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.01169093380971267\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44655041698256254,\n \
\ \"acc_stderr\": 0.01369356654974314\n }\n}\n```"
repo_url: https://huggingface.co/chanwit/flux-base-optimized
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|arc:challenge|25_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|arc:challenge|25_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|gsm8k|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|gsm8k|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hellaswag|10_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hellaswag|10_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T23-25-22.204907.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T23-31-14.212913.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T23-31-14.212913.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- '**/details_harness|winogrande|5_2024-02-11T23-25-22.204907.parquet'
- split: 2024_02_11T23_31_14.212913
path:
- '**/details_harness|winogrande|5_2024-02-11T23-31-14.212913.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-11T23-31-14.212913.parquet'
- config_name: results
data_files:
- split: 2024_02_11T23_25_22.204907
path:
- results_2024-02-11T23-25-22.204907.parquet
- split: 2024_02_11T23_31_14.212913
path:
- results_2024-02-11T23-31-14.212913.parquet
- split: latest
path:
- results_2024-02-11T23-31-14.212913.parquet
---
# Dataset Card for Evaluation run of chanwit/flux-base-optimized
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chanwit/flux-base-optimized](https://huggingface.co/chanwit/flux-base-optimized) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chanwit__flux-base-optimized",
"harness_winogrande_5",
	split="latest")
```
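As the timestamped splits listed in the configurations above suggest, each split name is simply the run timestamp with `-` and `:` replaced by `_` (the fractional-seconds `.` is kept). A minimal sketch of that mapping — the helper name here is ours, for illustration only:

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its split name in this dataset.

    e.g. "2024-02-11T23:31:14.212913" -> "2024_02_11T23_31_14.212913"
    """
    # '-' in the date part and ':' in the time part both become '_';
    # the '.' before the microseconds is left unchanged.
    return ts.replace("-", "_").replace(":", "_")
```

This can be handy when you want to load the split for one specific run rather than `"latest"`.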
## Latest results
These are the [latest results from run 2024-02-11T23:31:14.212913](https://huggingface.co/datasets/open-llm-leaderboard/details_chanwit__flux-base-optimized/blob/main/results_2024-02-11T23-31-14.212913.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5981649622618641,
"acc_stderr": 0.0333006317784589,
"acc_norm": 0.6020819933365092,
"acc_norm_stderr": 0.033975467298082776,
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.5001790307121097,
"mc2_stderr": 0.015267929934854846
},
"harness|arc:challenge|25": {
"acc": 0.60580204778157,
"acc_stderr": 0.01428052266746732,
"acc_norm": 0.6544368600682594,
"acc_norm_stderr": 0.013896938461145677
},
"harness|hellaswag|10": {
"acc": 0.6072495518820952,
"acc_stderr": 0.004873640184773443,
"acc_norm": 0.8173670583549094,
"acc_norm_stderr": 0.0038557568514415463
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138215,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138215
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6161290322580645,
"acc_stderr": 0.027666182075539635,
"acc_norm": 0.6161290322580645,
"acc_norm_stderr": 0.027666182075539635
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871927,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871927
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.017090573804217902,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.017090573804217902
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489294,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489294
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.014248873549217583,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.014248873549217583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3039106145251397,
"acc_stderr": 0.015382845587584506,
"acc_norm": 0.3039106145251397,
"acc_norm_stderr": 0.015382845587584506
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.02671611838015685,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.02671611838015685
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.027264297599804012,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.027264297599804012
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44002607561929596,
"acc_stderr": 0.012678037478574513,
"acc_norm": 0.44002607561929596,
"acc_norm_stderr": 0.012678037478574513
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.029896163033125474,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.029896163033125474
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854125,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854125
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249772,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.03265819588512699,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.03265819588512699
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366255,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366255
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.5001790307121097,
"mc2_stderr": 0.015267929934854846
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.01169093380971267
},
"harness|gsm8k|5": {
"acc": 0.44655041698256254,
"acc_stderr": 0.01369356654974314
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
quyanh/dolly | ---
dataset_info:
features:
- name: system_prompt
dtype: string
- name: inputs
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 14079200
num_examples: 15011
download_size: 7841758
dataset_size: 14079200
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dolly"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carnival13/xlmr_int_hard_curr_trn_ep2_corr | ---
dataset_info:
features:
- name: domain_label
dtype: int64
- name: pass_label
dtype: int64
- name: input
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 285070021
num_examples: 226100
download_size: 80645458
dataset_size: 285070021
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "xlmr_int_hard_curr_trn_ep2_corr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cognitivecomputations/leet10k-alpaca | ---
license: apache-2.0
---
|
HuzaifaHPC/chest_X_ray | ---
license: openrail
---
|
kye/all-lucidrain-code-python-tokenized-65536 | ---
license: mit
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-_fil_self_1.4b_bo2_100_kl_0.1_prm_410m_thr_0.1_seed_2 | ---
dataset_info:
config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43590759
num_examples: 18929
- name: epoch_1
num_bytes: 43794547
num_examples: 18929
- name: epoch_2
num_bytes: 43777667
num_examples: 18929
- name: epoch_3
num_bytes: 43724695
num_examples: 18929
- name: epoch_4
num_bytes: 43677772
num_examples: 18929
- name: epoch_5
num_bytes: 43651833
num_examples: 18929
- name: epoch_6
num_bytes: 43638979
num_examples: 18929
- name: epoch_7
num_bytes: 43620827
num_examples: 18929
- name: epoch_8
num_bytes: 43621348
num_examples: 18929
- name: epoch_9
num_bytes: 43625581
num_examples: 18929
download_size: 394487568
dataset_size: 436724008
configs:
- config_name: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: epoch_0
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/epoch_9-*
---
|
boapps/kmdb_relation_extraction | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- split: train
path: data/train-*
dataset_info:
features:
- name: paragraph
dtype: string
- name: relations
list:
- name: explanation
dtype: string
- name: object
dtype: string
- name: relation
dtype: string
- name: subject
dtype: string
splits:
- name: validation
num_bytes: 91165
num_examples: 106
- name: test
num_bytes: 86275
num_examples: 106
- name: train
num_bytes: 911376
num_examples: 1049
download_size: 702488
dataset_size: 1088816
---
# Dataset Card for "kmdb_relation_extraction"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nampdn-ai/tiny-textbooks | ---
task_categories:
- text-generation
language:
- en
pretty_name: Tiny Textbooks
size_categories:
- 100K<n<1M
license: cc-by-nc-sa-4.0
---
# Textbook-like Dataset: A High-Quality Resource for Small Language Models
The idea is simply inspired by the [Textbooks Are All You Need II: phi-1.5 technical report](https://arxiv.org/abs/2309.05463) paper. The source texts in this dataset have been gathered and carefully selected from the best of the [falcon-refinedweb](https://arxiv.org/abs/2306.01116) and [minipile](https://arxiv.org/abs/2304.08442) datasets to ensure diversity and quality while remaining tiny in size. The dataset was synthesized using 4x3090 Ti cards over a period of 500 hours, using the finetuned [Nous-Hermes-Llama2-13b](https://huggingface.co/NousResearch/Nous-Hermes-Llama2-13b) model.
Why settle for low-quality text when you can train on a high-quality, textbook-like dataset? Training language models on subpar text can lead to several issues:
1. **Noise**: Such text often contains typos, grammatical errors, and poorly structured sentences, which can confuse models and degrade performance.
2. **Misinformation**: Low-quality web text may contain incorrect or misleading information, leading to models propagating these inaccuracies.
3. **Lack of Depth**: Subpar text often lacks the depth and detail found in high-quality content, limiting a model's understanding of complex topics.
Conversely, training on my clean and high-quality dataset offers numerous advantages:
1. **Accuracy**: The theoretical concepts in my dataset provide nearly accurate and detailed information, akin to a well-written textbook. (More contributions are needed for fact-checking.)
2. **Context**: Practical examples demonstrate how these concepts apply in real-world situations, offering valuable context.
3. **Performance**: Models trained on high-quality data can generate more accurate, insightful, and human-like text.
A standout feature of this dataset is its volume. It boasts a whopping **420,000 textbook documents**. This extensive collection ensures a wide coverage of topics and concepts, providing your models with a comprehensive and diverse learning resource.
Moreover, this dataset is generated using an open-source language model, ensuring the data is open for every researcher to process. I love the openness, and that's why I want to contribute this dataset to help the community push past the limits.
Quality over quantity is a principle that holds true even in machine learning. Training on a large amount of low-quality tokens can lead to models learning and propagating the noise, inaccuracies, and poor structures present in the bad text. This can result in models that generate less accurate and less coherent outputs.
On the other hand, training on a smaller amount of high-quality tokens, like those in this dataset, can yield significantly better results. High-quality tokens provide accurate, well-structured, and meaningful information from which models can learn effectively. This leads to models that can generate more accurate, insightful, and human-like text.
In essence, it's about making every token count. Each high-quality token that a model learns from is a step towards better performance. So why waste computational resources and learning capacity on bad tokens when you can focus on high-quality ones? It's a more efficient and effective approach to training language models.
Choosing high-quality dataset over low-quality web text is akin to opting for a reliable textbook over scattered internet articles. This choice can significantly enhance the performance and reliability of your causal language models.
I'm excited to present this unique blend of theoretical concepts and practical examples designed to supercharge your causal language models. This isn't just another dataset; it's a high-quality resource that can help your models learn more effectively and with better common sense.
I hope this dataset is a useful resource for ML researchers working with small causal language models. I eagerly await your feedback and suggestions as I continue to refine and expand the dataset. Together, let's push the boundaries of what's possible with **tiny language models**!
## Visualization
[Nomic Atlas](https://atlas.nomic.ai/map/0348f3f7-9280-404f-b6d3-d0b5993a6693/846bcd82-fcc5-474d-b24b-82d1b791f80b) 230k data points visualized thanks to Nomic AI platform.
### Disclaimer
While every effort has been made to ensure the accuracy of the information contained within this dataset, please note that it is provided 'as is' and without any warranties.
The use of the `textbook` field in this dataset is intended for research purposes only. You are advised to verify any information obtained from this dataset before acting upon it.
## Tiny Series
Explore the possibilities and limitations of building Small Language Models with these tiny gems of data!
- [TinyStories](https://arxiv.org/abs/2305.07759): The paper that sparked my interest in the journey of the tiny-* series.
- [tiny-strange-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-strange-textbooks): Collection of 2.7M strange textbooks on diverse topics.
- [tiny-codes](https://huggingface.co/datasets/nampdn-ai/tiny-codes): Collection of 1.6M short and clear code snippets that can help LLM models learn how to reason.
- [tiny-math-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-math-textbooks): Collection of 635k short math textbooks on various mathematical topics.
- [tiny-orca-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-orca-textbooks): Synthetic textbooks to help models learn in-context how to perform tasks the right way.
- [tiny-webtext](https://huggingface.co/datasets/nampdn-ai/tiny-webtext): A 6GB (4.5M records) variety of diverse webtext enriched with critical thinking methods to make an unbiased English dataset.
- [tiny-lessons](https://huggingface.co/datasets/nampdn-ai/tiny-lessons): A subset of this dataset, with various lessons about "things of the internet" augmented in a bite-sized textbook Markdown format.
- [tiny-bridgedict](https://huggingface.co/datasets/nampdn-ai/tiny-bridgedict): A dataset that links and transfers knowledge between English, Vietnamese, and Chinese for tiny multilingual models.
## Citation
```
@misc {nam_pham_2023,
author = { {Nam Pham} },
title = { tiny-textbooks (Revision 14de7ba) },
year = 2023,
url = { https://huggingface.co/datasets/nampdn-ai/tiny-textbooks },
doi = { 10.57967/hf/1126 },
publisher = { Hugging Face }
}
``` |
VozBonita/guilherme | ---
license: openrail
---
|
tingkart/NorwayTrivia | ---
license: apache-2.0
task_categories:
- question-answering
language:
- 'no'
tags:
- art
pretty_name: Norway Trivia
size_categories:
- 1K<n<10K
---
# Dataset Card for Norway Knowledge Dataset
### Dataset Summary
This dataset consists of question and answer pairs in the Norwegian language, covering topics related to Norway, its culture, governance, history, economy, geography, people, and international relations. Generated using OpenAI's ChatGPT 3.5 and Anthropic's Claude 2 on 09.08.2023.
### Supported Tasks and Leaderboards
- **Question Answering:** Benchmark for models to understand and respond to questions related to Norway.
- **Language Modeling:** Useful for training models in the Norwegian language with specific knowledge about Norway.
### Languages
Norwegian (Bokmål and Nynorsk).
## Dataset Structure
### Data Instances
The dataset contains pairs of questions and answers in Norwegian.
### Data Fields
- **Concept:** The broader topic under which the question falls.
- **Assistance:** The question presented to the model.
- **Text:** The corresponding answer generated by the model.
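For illustration, a record with these fields might look like the following. Note that the Norwegian question/answer text below is an invented example, not an actual row from the dataset:

```python
# A sketch of one dataset record, assuming the three fields described above.
# The Norwegian text is an invented illustration, not an actual row.
record = {
    "Concept": "Norwegian fjords and their formation",         # broader topic
    "Assistance": "Hvordan ble de norske fjordene dannet?",    # question: "How were the Norwegian fjords formed?"
    "Text": "Fjordene ble formet av isbreer under istidene.",  # answer: "The fjords were formed by glaciers during the ice ages."
}

# A simple validation pass one might run over such records:
def is_valid(rec):
    return all(isinstance(rec.get(k), str) and rec[k] for k in ("Concept", "Assistance", "Text"))

print(is_valid(record))  # → True
```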
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
The dataset was curated to promote the study of Norway and to support research in Norwegian language processing.
### Source Data
#### Initial Data Collection and Normalization
Data was generated using OpenAI's ChatGPT 3.5 and Anthropic's Claude 2 on 09.08.2023.
#### Who are the source language producers?
OpenAI (ChatGPT 3.5) and Anthropic (Claude 2).
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
The dataset does not include any personal or sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset serves as a rich resource for researchers and educators focusing on Norway and the Norwegian language.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
A team of researchers and linguistic experts focused on Norwegian studies.
### Licensing Information
Creative Commons Attribution 4.0 International License.
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
**Examples of topics:**
1. Norwegian fjords and their formation
2. Sami culture and history
3. Norway's contribution to the United Nations
4. Political structure of Norway
5. Stave churches and their architecture
6. Norwegian Nobel Committee and Peace Prize
7. Impact of oil and gas on Norway's economy
8. Norway's educational system
9. History of the Vikings in Norway
10. Norway's role in NATO
11. Traditional Norwegian cuisine
12. Norwegian literature and famous authors
13. The Svalbard Treaty
14. Immigration in Norway
15. Norway's renewable energy policies
16. Influence of Lutheranism in Norway
17. Norwegian art and famous artists
18. Norway's healthcare system
19. The Royal Family of Norway
20. Norway's relationship with the European Union
|
ittailup/lallama-data-small | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 762624244
num_examples: 100000
download_size: 412325738
dataset_size: 762624244
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "lallama-data-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Felladrin/ChatML-distilabel-intel-orca-dpo-pairs | ---
license: apache-2.0
language:
- en
size_categories:
- 10K<n<100K
---
[argilla/distilabel-intel-orca-dpo-pairs](https://huggingface.co/datasets/argilla/distilabel-intel-orca-dpo-pairs) in ChatML format, ready to use in [HuggingFace TRL's DPO Trainer](https://huggingface.co/docs/trl/main/en/dpo_trainer).
Python code used for conversion:
```python
from datasets import load_dataset
dataset = load_dataset("argilla/distilabel-intel-orca-dpo-pairs", split="train")
def format(columns):
    # Wrap the user message (and optional system message) in ChatML tags,
    # leaving the assistant turn open for the completion.
    prompt = f"<|im_start|>user\n{columns['input']}<|im_end|>\n<|im_start|>assistant\n"
    if columns['system']:
        prompt = f"<|im_start|>system\n{columns['system']}<|im_end|>\n{prompt}"
    return {
        "prompt": prompt,
        "chosen": f"{columns['chosen']}<|im_end|>",
        "rejected": f"{columns['rejected']}<|im_end|>",
    }

dataset.map(format).select_columns(
    ['prompt', 'chosen', 'rejected', 'status', 'chosen_score', 'in_gsm8k_train']
).to_parquet("train.parquet")
```
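For clarity, here is a standalone sketch of what that per-row conversion produces. The sample row below is invented; only the ChatML wrapping mirrors the script above:

```python
# Standalone sketch of the per-row ChatML wrapping (sample row is invented).
def to_chatml(columns):
    prompt = f"<|im_start|>user\n{columns['input']}<|im_end|>\n<|im_start|>assistant\n"
    if columns["system"]:
        prompt = f"<|im_start|>system\n{columns['system']}<|im_end|>\n{prompt}"
    return {
        "prompt": prompt,
        "chosen": f"{columns['chosen']}<|im_end|>",
        "rejected": f"{columns['rejected']}<|im_end|>",
    }

row = {
    "system": "You are a helpful assistant.",
    "input": "What is 2 + 2?",
    "chosen": "2 + 2 equals 4.",
    "rejected": "5",
}
result = to_chatml(row)
print(result["prompt"])
# <|im_start|>system
# You are a helpful assistant.<|im_end|>
# <|im_start|>user
# What is 2 + 2?<|im_end|>
# <|im_start|>assistant
```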
|
bobber/Terrier-images | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1623322.0
num_examples: 18
download_size: 1624818
dataset_size: 1623322.0
---
# Dataset Card for "Terrier-images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/mori_nozomi_seitokaiyakuindomo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Mori Nozomi (Seitokai Yakuindomo)
This is the dataset of Mori Nozomi (Seitokai Yakuindomo), containing 68 images and their tags.
The core tags of this character are `brown_hair, short_hair, brown_eyes, yellow_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 68 | 36.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mori_nozomi_seitokaiyakuindomo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 68 | 31.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mori_nozomi_seitokaiyakuindomo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 141 | 64.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mori_nozomi_seitokaiyakuindomo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 68 | 36.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mori_nozomi_seitokaiyakuindomo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 141 | 72.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mori_nozomi_seitokaiyakuindomo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mori_nozomi_seitokaiyakuindomo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, school_uniform, necktie, solo, smile, bag, single_hair_bun, profile, skirt |
| 1 | 6 |  |  |  |  |  | 1girl, necktie, school_uniform, smile, solo, ^_^ |
| 2 | 6 |  |  |  |  |  | 1girl, day, school_uniform, solo, blazer, outdoors, red_necktie, tree, hair_between_eyes, white_shirt, smile, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | school_uniform | necktie | solo | smile | bag | single_hair_bun | profile | skirt | ^_^ | day | blazer | outdoors | red_necktie | tree | hair_between_eyes | white_shirt | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:----------|:-------|:--------|:------|:------------------|:----------|:--------|:------|:------|:---------|:-----------|:--------------|:-------|:--------------------|:--------------|:-------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | | | | | X | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | | X | X | | | | | | X | X | X | X | X | X | X | X |
|
bigbio/cas |
---
language:
- fr
bigbio_language:
- French
license: other
multilinguality: monolingual
bigbio_license_shortname: DUA
pretty_name: CAS
homepage: https://clementdalloux.fr/?page_id=28
bigbio_pubmed: False
bigbio_public: False
bigbio_tasks:
- TEXT_CLASSIFICATION
---
# Dataset Card for CAS
## Dataset Description
- **Homepage:** https://clementdalloux.fr/?page_id=28
- **Pubmed:** False
- **Public:** False
- **Tasks:** TXTCLASS
We manually annotated two corpora from the biomedical field. The ESSAI corpus contains clinical trial protocols in French. They were mainly obtained from the National Cancer Institute. The typical protocol consists of two parts: the summary of the trial, which indicates the purpose of the trial and the methods applied; and a detailed description of the trial with the inclusion and exclusion criteria. The CAS corpus contains clinical cases published in scientific literature and training material. They are published in different journals from French-speaking countries (France, Belgium, Switzerland, Canada, African countries, tropical countries) and are related to various medical specialties (cardiology, urology, oncology, obstetrics, pulmonology, gastro-enterology). The purpose of clinical cases is to describe clinical situations of patients. Hence, their content is close to the content of clinical narratives (description of diagnoses, treatments or procedures, evolution, family history, expected audience, etc.). In clinical cases, negation is frequently used for describing the patient's signs, symptoms, and diagnosis. Speculation is present as well, but less frequently.
This version only contain the annotated CAS corpus
## Citation Information
```
@inproceedings{grabar-etal-2018-cas,
title = {{CAS}: {F}rench Corpus with Clinical Cases},
author = {Grabar, Natalia and Claveau, Vincent and Dalloux, Cl{\'e}ment},
year = 2018,
month = oct,
booktitle = {
Proceedings of the Ninth International Workshop on Health Text Mining and
Information Analysis
},
publisher = {Association for Computational Linguistics},
address = {Brussels, Belgium},
pages = {122--128},
doi = {10.18653/v1/W18-5614},
url = {https://aclanthology.org/W18-5614},
abstract = {
Textual corpora are extremely important for various NLP applications as
they provide information necessary for creating, setting and testing these
applications and the corresponding tools. They are also crucial for
designing reliable methods and reproducible results. Yet, in some areas,
such as the medical area, due to confidentiality or to ethical reasons, it
is complicated and even impossible to access textual data representative of
those produced in these areas. We propose the CAS corpus built with
clinical cases, such as they are reported in the published scientific
literature in French. We describe this corpus, currently containing over
397,000 word occurrences, and the existing linguistic and semantic
annotations.
}
}
```
|
allenai/WildBench | ---
dataset_info:
features:
- name: id
dtype: int64
- name: session_id
dtype: string
- name: conversation_input
list:
- name: content
dtype: string
- name: language
dtype: string
- name: redacted
dtype: bool
- name: role
dtype: string
- name: toxic
dtype: bool
- name: references
struct:
- name: gpt-4
dtype: string
- name: checklist
sequence: string
- name: length
dtype: int64
- name: primary_tag
dtype: string
- name: secondary_tags
sequence: string
- name: intent
dtype: string
- name: appropriate
dtype: string
splits:
- name: test
num_bytes: 7418465
num_examples: 1024
download_size: 3681202
dataset_size: 7418465
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
task_categories:
- text-generation
language:
- en
pretty_name: wildbench
size_categories:
- 1K<n<10K
---
<div style="display: flex; justify-content: flex-start;"><img src="https://allenai.github.io/WildBench/wildbench_logo.png" alt="Banner" style="width: 40vw; min-width: 300px; max-width: 800px;"> </div>
# 🦁 WildBench: Benchmarking LLMs with Challenging Tasks from Real Users in the Wild
## Quick Links:
- [HF Leaderboard](https://huggingface.co/spaces/allenai/WildBench)
- [HF Dataset](https://huggingface.co/datasets/allenai/WildBench)
- [Github](https://github.com/allenai/WildBench)
## Dataset Description
- **License:** https://allenai.org/licenses/impact-lr
- **Language(s) (NLP):** English
- **Point of Contact:** [Yuchen Lin](mailto:yuchenl@allenai.org)
WildBench is a subset of [WildChat](https://huggingface.co/datasets/allenai/WildChat), which has been openly released under AI2's ImpACT license as a low-risk artifact. The use of WildChat data to cause harm is strictly prohibited.
## Data Fields
The dataset on Hugging Face is organized with several features, each of which is designed to capture specific information pertinent to the data being represented. Here is a descriptive breakdown of each feature:
- `id`: A unique identifier for each entry, represented as an integer (`int64`). Not often used.
- `session_id`: A string that uniquely identifies an example, which is usually used as id.
- `conversation_input`: A list structure that encompasses multiple attributes related to the input of the conversation:
- `content`: The actual text content of the conversation input, stored as a string.
- `language`: A string indicating the language used in the conversation input.
- `redacted`: A boolean flag (`bool`) to denote whether any part of the content has been redacted for privacy or other reasons.
- `role`: A string indicating the role of the party in the conversation (e.g., 'user', 'assistant').
- `toxic`: A boolean indicating whether the content contains any toxic elements.
- `references`: Reference outputs keyed by model name.
  - `gpt-4`: The GPT-4 generation used as the reference assistant response for the next turn.
- `checklist`: A sequence of strings that could represent a set of questions to evaluate the outputs.
- `length`: An integer (`int64`) representing the length of the conversation or content. Note that this is the number of messages.
- `primary_tag`: A string that labels the entry with a primary category.
- `secondary_tags`: A sequence of strings providing additional categorizations.
- `intent`: A string indicating the underlying intent of the conversation or the interaction instance.
- `appropriate`: A string that assesses or describes whether the conversation or content is considered appropriate, potentially in terms of content, context, or some other criteria.
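To make the schema concrete, here is a hypothetical record with invented values (not an actual dataset row):

```python
# Invented example matching the field descriptions above; not an actual row.
example = {
    "id": 0,
    "session_id": "example-session-0001",
    "conversation_input": [
        {"content": "Summarize the plot of Hamlet in two sentences.",
         "language": "English", "redacted": False, "role": "user", "toxic": False},
    ],
    "references": {"gpt-4": "Prince Hamlet seeks revenge ..."},
    "checklist": ["Is the summary two sentences long?", "Is the plot accurate?"],
    "length": 1,  # number of messages in conversation_input
    "primary_tag": "Information seeking",
    "secondary_tags": ["Creative Writing"],
    "intent": "Get a concise plot summary",
    "appropriate": "appropriate",
}

# The last user turn is what a model under evaluation must respond to:
last_user_turn = [m for m in example["conversation_input"] if m["role"] == "user"][-1]
print(last_user_turn["content"])
```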
### Introduction of the WildBench Leaderboard
<details open><summary style="font-size: 1.5em; font-weight: bold;"> What is WildBench? Why should I use it?</summary>
<div style="font-size: 1.2em; margin-top: 30px;">
🦁 <b>WildBench</b> is a benchmark for evaluating large language models (LLMs) on challenging tasks that are more representative of real-world applications. The examples are collected from real users by the <a href="https://wildchat.allen.ai/"><b>AI2 WildChat</b></a> project.
<br>
<b>🆕 Motivation</b>: We aim to provide a more <strong>realistic</strong> and <strong>challenging</strong> benchmark for evaluating LLMs, as opposed to existing benchmarks that do not capture the <em>diversity</em> and <em>complexity</em> of <em>real-world</em> tasks.
<h2 style="color: purple">🌠 Key Features:</h2>
<ul>
<li><b style="color: purple">🌟 Fine-grained:</b>
We provide a fine-grained annotation for each example, including task types and <b>checklists</b> for evaluating the quality of responses. In addition, we use <b>length-penalized</b> Elo ratings to ensure that the quality of responses is not biased towards longer outputs.</li>
<li><b style="color: purple">🌟 Transparent & Fair: </b> We test all LLMs on the SAME set of examples, ensuring a fair evaluation. You can explore the data and see the difference between two models to analyze the concrete gap between any pair of LLMs. </li>
<li><b style="color: purple">🌟 Easy & Fast:</b> WildBench (v1.0) contains 1024 examples, and it is extremely easy to add your own LLMs to our leaderboard! 1️⃣ Let us know your model ID and suggested inference configs; 2️⃣ We'll run inference and evaluation for you; 3️⃣ Voilà! We'll notify you when your results are ready on the leaderboard.</li>
<li><b style="color: purple">🌟 Dynamic:</b> WildBench will not be a static dataset. We will continue adding new examples and updating evaluation methods. Our goal is to include new challenging examples from real users over time and provide fast yet reliable evaluations.</li>
<li><b style="color: purple">🌟 Human Verification (ongoing):</b> Although we currently use GPT-4 as the automatic evaluator, we are also collecting human preferences here (see the 🔍 🆚 Tab). We plan to update the leaderboard by incorporating human evaluations in the near future.</li>
<li><b style="color: purple">🌟 Community-driven:</b> In addition to collecting human preferences for improving our evaluation, we also welcome community users to contribute new examples they find challenging to top LLMs like GPT-4/Claude3. Any feedback and suggestions are welcome, and we'll do our best to upgrade our data and evaluation methods accordingly. </li>
</ul>
</div>
</details>
## Licensing Information
WildChat is made available under the [**AI2
ImpACT License - Low Risk Artifacts ("LR
Agreement")**](https://allenai.org/licenses/impact-lr)
## Citation
```bibtex
@misc{wildbench2024,
title = {WildBench: Benchmarking LLMs with Challenging Tasks from Real Users in the Wild},
author = {Bill Yuchen Lin and Khyathi Chandu and Faeze Brahman and Yuntian Deng and Abhilasha Ravichander and Valentina Pyatkin and Ronan Le Bras and Yejin Choi},
year = 2024,
url = {https://huggingface.co/spaces/allenai/WildBench},
}
``` |
sankettgorey/layouts_spanish2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 202091150.8
num_examples: 560
- name: test
num_bytes: 25309447.1
num_examples: 70
- name: validation
num_bytes: 25195273.1
num_examples: 70
download_size: 228019645
dataset_size: 252595871.0
---
# Dataset Card for "layouts_spanish2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigscience-data/roots_zh_du_reader | ---
language: zh
license: apache-2.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_zh_du_reader
# DuReader
- Dataset uid: `du_reader`
### Description
DuReader is a large-scale real-world Chinese dataset for Machine Reading Comprehension (MRC) and Question Answering (QA).
### Homepage
https://ai.baidu.com/broad/introduction?dataset=dureader
### Licensing
- copyright - all rights reserved
- apache-2.0: Apache License 2.0
Copyright 2017 Baidu.com, Inc. All Rights Reserved
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
### Speaker Locations
- China
### Sizes
- 0.1771 % of total
- 0.6194 % of zh
### BigScience processing steps
#### Filters applied to: zh
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
|
DBQ/Net.a.Porter.Product.prices.Russia | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Russia - Net-a-Porter - Product-level price list
tags:
- webscraping
- ecommerce
- Net
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: int64
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 16708490
num_examples: 41393
download_size: 5182135
dataset_size: 16708490
---
# Net-a-Porter web scraped data
## About the website
The **EMEA fashion industry**, particularly in **Russia**, has been experiencing substantial growth in online channels due to increased internet penetration and smartphone usage. A significant player in this space is **Net-a-Porter**, a platform in the **luxury ecommerce industry** offering a wide range of premium brands. As consumer shopping behavior shifts towards digital platforms, **Net-a-Porter** is strengthening its presence. This dataset provides insight into its online activities, specifically the **ecommerce product-list page (PLP) data** for Net-a-Porter in Russia, offering key understanding of customer preferences, behavior, and potential market trends.
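Given the price fields in the schema above (`full_price`, `price`, `flg_discount`), a common derived metric is the discount rate. A minimal sketch with an invented row:

```python
# Invented example row using the price fields from the schema above.
row = {"full_price": 1000.0, "price": 750.0, "flg_discount": 1}

def discount_rate(rec):
    """Fraction taken off the full price; 0.0 when the item is not discounted."""
    if not rec["flg_discount"] or rec["full_price"] <= 0:
        return 0.0
    return 1.0 - rec["price"] / rec["full_price"]

print(discount_rate(row))  # → 0.25
```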
## Link to **dataset**
[Russia - Net-a-Porter - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Net-a-Porter%20Product-prices%20Russia/r/recjMpDIWAH8eIxfl)
|
open-llm-leaderboard/details_ewqr2130__mistral-7b-raw-sft | ---
pretty_name: Evaluation run of ewqr2130/mistral-7b-raw-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ewqr2130/mistral-7b-raw-sft](https://huggingface.co/ewqr2130/mistral-7b-raw-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__mistral-7b-raw-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T15:14:57.972449](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__mistral-7b-raw-sft/blob/main/results_2024-01-10T15-14-57.972449.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3451686041304389,\n\
\ \"acc_stderr\": 0.033177024770114395,\n \"acc_norm\": 0.34794617103590064,\n\
\ \"acc_norm_stderr\": 0.033992606612009306,\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299963,\n \"mc2\": 0.4077071941467522,\n\
\ \"mc2_stderr\": 0.014214727907656348\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.43430034129692835,\n \"acc_stderr\": 0.01448470304885736,\n\
\ \"acc_norm\": 0.47440273037542663,\n \"acc_norm_stderr\": 0.014592230885298964\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5518820952001593,\n\
\ \"acc_stderr\": 0.004962846206125493,\n \"acc_norm\": 0.7525393347938658,\n\
\ \"acc_norm_stderr\": 0.004306547156331412\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.41509433962264153,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.41509433962264153,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3402777777777778,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.3402777777777778,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\
\ \"acc_stderr\": 0.035839017547364106,\n \"acc_norm\": 0.32947976878612717,\n\
\ \"acc_norm_stderr\": 0.035839017547364106\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.03395490020856112,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.03395490020856112\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4258064516129032,\n \"acc_stderr\": 0.0281291127091659,\n \"acc_norm\"\
: 0.4258064516129032,\n \"acc_norm_stderr\": 0.0281291127091659\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3103448275862069,\n\
\ \"acc_stderr\": 0.032550867699701024,\n \"acc_norm\": 0.3103448275862069,\n\
\ \"acc_norm_stderr\": 0.032550867699701024\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \
\ \"acc\": 0.44242424242424244,\n \"acc_stderr\": 0.03878372113711275,\n\
\ \"acc_norm\": 0.44242424242424244,\n \"acc_norm_stderr\": 0.03878372113711275\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.49222797927461137,\n \"acc_stderr\": 0.036080032255696545,\n\
\ \"acc_norm\": 0.49222797927461137,\n \"acc_norm_stderr\": 0.036080032255696545\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3384615384615385,\n \"acc_stderr\": 0.02399150050031304,\n \
\ \"acc_norm\": 0.3384615384615385,\n \"acc_norm_stderr\": 0.02399150050031304\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.031204691225150013,\n\
\ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.031204691225150013\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3724770642201835,\n \"acc_stderr\": 0.020728368457638494,\n \"\
acc_norm\": 0.3724770642201835,\n \"acc_norm_stderr\": 0.020728368457638494\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4019607843137255,\n \"acc_stderr\": 0.034411900234824655,\n \"\
acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.034411900234824655\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3755274261603376,\n \"acc_stderr\": 0.03152256243091156,\n \
\ \"acc_norm\": 0.3755274261603376,\n \"acc_norm_stderr\": 0.03152256243091156\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.33183856502242154,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.33183856502242154,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.35877862595419846,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.35877862595419846,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.39669421487603307,\n \"acc_stderr\": 0.04465869780531009,\n \"\
acc_norm\": 0.39669421487603307,\n \"acc_norm_stderr\": 0.04465869780531009\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3611111111111111,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.3611111111111111,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3312883435582822,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.3312883435582822,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.17857142857142858,\n\
\ \"acc_stderr\": 0.036352091215778065,\n \"acc_norm\": 0.17857142857142858,\n\
\ \"acc_norm_stderr\": 0.036352091215778065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.36893203883495146,\n \"acc_stderr\": 0.04777615181156739,\n\
\ \"acc_norm\": 0.36893203883495146,\n \"acc_norm_stderr\": 0.04777615181156739\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5085470085470085,\n\
\ \"acc_stderr\": 0.0327513030009703,\n \"acc_norm\": 0.5085470085470085,\n\
\ \"acc_norm_stderr\": 0.0327513030009703\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4240102171136654,\n\
\ \"acc_stderr\": 0.017672263329084226,\n \"acc_norm\": 0.4240102171136654,\n\
\ \"acc_norm_stderr\": 0.017672263329084226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.028180596328259293,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.028180596328259293\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.34726688102893893,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.34726688102893893,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.31790123456790126,\n \"acc_stderr\": 0.02591006352824088,\n\
\ \"acc_norm\": 0.31790123456790126,\n \"acc_norm_stderr\": 0.02591006352824088\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.02689170942834396,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.02689170942834396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2940026075619296,\n\
\ \"acc_stderr\": 0.011636062953698604,\n \"acc_norm\": 0.2940026075619296,\n\
\ \"acc_norm_stderr\": 0.011636062953698604\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4632352941176471,\n \"acc_stderr\": 0.030290619180485687,\n\
\ \"acc_norm\": 0.4632352941176471,\n \"acc_norm_stderr\": 0.030290619180485687\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.28431372549019607,\n \"acc_stderr\": 0.018249024411207668,\n \
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.018249024411207668\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.42727272727272725,\n\
\ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.42727272727272725,\n\
\ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3877551020408163,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.3877551020408163,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.43283582089552236,\n\
\ \"acc_stderr\": 0.03503490923673281,\n \"acc_norm\": 0.43283582089552236,\n\
\ \"acc_norm_stderr\": 0.03503490923673281\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n\
\ \"acc_stderr\": 0.0374005938202932,\n \"acc_norm\": 0.3614457831325301,\n\
\ \"acc_norm_stderr\": 0.0374005938202932\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3742690058479532,\n \"acc_stderr\": 0.03711601185389481,\n\
\ \"acc_norm\": 0.3742690058479532,\n \"acc_norm_stderr\": 0.03711601185389481\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299963,\n \"mc2\": 0.4077071941467522,\n\
\ \"mc2_stderr\": 0.014214727907656348\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7300710339384373,\n \"acc_stderr\": 0.012476433372002608\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.037149355572403335,\n \
\ \"acc_stderr\": 0.005209516283073736\n }\n}\n```"
repo_url: https://huggingface.co/ewqr2130/mistral-7b-raw-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|arc:challenge|25_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|gsm8k|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hellaswag|10_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-14-57.972449.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T15-14-57.972449.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- '**/details_harness|winogrande|5_2024-01-10T15-14-57.972449.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T15-14-57.972449.parquet'
- config_name: results
data_files:
- split: 2024_01_10T15_14_57.972449
path:
- results_2024-01-10T15-14-57.972449.parquet
- split: latest
path:
- results_2024-01-10T15-14-57.972449.parquet
---
# Dataset Card for Evaluation run of ewqr2130/mistral-7b-raw-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ewqr2130/mistral-7b-raw-sft](https://huggingface.co/ewqr2130/mistral-7b-raw-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ewqr2130__mistral-7b-raw-sft",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-10T15:14:57.972449](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__mistral-7b-raw-sft/blob/main/results_2024-01-10T15-14-57.972449.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.3451686041304389,
"acc_stderr": 0.033177024770114395,
"acc_norm": 0.34794617103590064,
"acc_norm_stderr": 0.033992606612009306,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299963,
"mc2": 0.4077071941467522,
"mc2_stderr": 0.014214727907656348
},
"harness|arc:challenge|25": {
"acc": 0.43430034129692835,
"acc_stderr": 0.01448470304885736,
"acc_norm": 0.47440273037542663,
"acc_norm_stderr": 0.014592230885298964
},
"harness|hellaswag|10": {
"acc": 0.5518820952001593,
"acc_stderr": 0.004962846206125493,
"acc_norm": 0.7525393347938658,
"acc_norm_stderr": 0.004306547156331412
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.28289473684210525,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.28289473684210525,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.41509433962264153,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.41509433962264153,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3402777777777778,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.3402777777777778,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.035839017547364106,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.035839017547364106
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856112,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856112
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4258064516129032,
"acc_stderr": 0.0281291127091659,
"acc_norm": 0.4258064516129032,
"acc_norm_stderr": 0.0281291127091659
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.44242424242424244,
"acc_stderr": 0.03878372113711275,
"acc_norm": 0.44242424242424244,
"acc_norm_stderr": 0.03878372113711275
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.49222797927461137,
"acc_stderr": 0.036080032255696545,
"acc_norm": 0.49222797927461137,
"acc_norm_stderr": 0.036080032255696545
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3384615384615385,
"acc_stderr": 0.02399150050031304,
"acc_norm": 0.3384615384615385,
"acc_norm_stderr": 0.02399150050031304
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.031204691225150013,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.031204691225150013
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3724770642201835,
"acc_stderr": 0.020728368457638494,
"acc_norm": 0.3724770642201835,
"acc_norm_stderr": 0.020728368457638494
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.034411900234824655,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.034411900234824655
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3755274261603376,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.3755274261603376,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.33183856502242154,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.33183856502242154,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.35877862595419846,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.35877862595419846,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.39669421487603307,
"acc_stderr": 0.04465869780531009,
"acc_norm": 0.39669421487603307,
"acc_norm_stderr": 0.04465869780531009
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3312883435582822,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.3312883435582822,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.17857142857142858,
"acc_stderr": 0.036352091215778065,
"acc_norm": 0.17857142857142858,
"acc_norm_stderr": 0.036352091215778065
},
"harness|hendrycksTest-management|5": {
"acc": 0.36893203883495146,
"acc_stderr": 0.04777615181156739,
"acc_norm": 0.36893203883495146,
"acc_norm_stderr": 0.04777615181156739
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5085470085470085,
"acc_stderr": 0.0327513030009703,
"acc_norm": 0.5085470085470085,
"acc_norm_stderr": 0.0327513030009703
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4240102171136654,
"acc_stderr": 0.017672263329084226,
"acc_norm": 0.4240102171136654,
"acc_norm_stderr": 0.017672263329084226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.028180596328259293,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.028180596328259293
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.34726688102893893,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.34726688102893893,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.31790123456790126,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.31790123456790126,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.02689170942834396,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.02689170942834396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2940026075619296,
"acc_stderr": 0.011636062953698604,
"acc_norm": 0.2940026075619296,
"acc_norm_stderr": 0.011636062953698604
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4632352941176471,
"acc_stderr": 0.030290619180485687,
"acc_norm": 0.4632352941176471,
"acc_norm_stderr": 0.030290619180485687
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.018249024411207668,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.018249024411207668
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.42727272727272725,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.42727272727272725,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3877551020408163,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.3877551020408163,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.43283582089552236,
"acc_stderr": 0.03503490923673281,
"acc_norm": 0.43283582089552236,
"acc_norm_stderr": 0.03503490923673281
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.0374005938202932,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.0374005938202932
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3742690058479532,
"acc_stderr": 0.03711601185389481,
"acc_norm": 0.3742690058479532,
"acc_norm_stderr": 0.03711601185389481
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299963,
"mc2": 0.4077071941467522,
"mc2_stderr": 0.014214727907656348
},
"harness|winogrande|5": {
"acc": 0.7300710339384373,
"acc_stderr": 0.012476433372002608
},
"harness|gsm8k|5": {
"acc": 0.037149355572403335,
"acc_stderr": 0.005209516283073736
}
}
```
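To inspect the per-task numbers programmatically (for instance, to find the strongest and weakest MMLU subtasks), you can work with the results dictionary directly. The sketch below inlines a small excerpt of the JSON above; in practice you would load the full results file from the repository instead:

```python
# Small excerpt of the results above; in practice, load the full
# results_*.json from the repository instead of inlining it.
results = {
    "harness|hendrycksTest-marketing|5": {"acc_norm": 0.5085470085470085},
    "harness|hendrycksTest-formal_logic|5": {"acc_norm": 0.1746031746031746},
    "harness|hendrycksTest-high_school_statistics|5": {"acc_norm": 0.46296296296296297},
}

# Strip the "harness|hendrycksTest-" prefix and the "|5" few-shot suffix.
mmlu = {
    key.split("-", 1)[1].rsplit("|", 1)[0]: scores["acc_norm"]
    for key, scores in results.items()
    if "hendrycksTest" in key
}
best, worst = max(mmlu, key=mmlu.get), min(mmlu, key=mmlu.get)
print(best, worst)  # marketing formal_logic
```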
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
bjoernp/oscar2023_de_deduped | ---
task_categories:
- text-generation
language:
- de
size_categories:
- 10M<n<100M
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: meta
struct:
- name: warc_headers
struct:
- name: warc-record-id
dtype: string
- name: warc-date
dtype: string
- name: content-type
dtype: string
- name: content-length
dtype: int32
- name: warc-type
dtype: string
- name: warc-identified-content-language
dtype: string
- name: warc-refers-to
dtype: string
- name: warc-target-uri
dtype: string
- name: warc-block-digest
dtype: string
- name: identification
struct:
- name: label
dtype: string
- name: prob
dtype: float32
- name: harmful_pp
dtype: float32
- name: tlsh
dtype: string
- name: quality_warnings
sequence: string
- name: categories
sequence: string
- name: sentence_identifications
list:
- name: label
dtype: string
- name: prob
dtype: float32
splits:
- name: train
num_bytes: 382684030510
num_examples: 53172498
download_size: 80368267320
dataset_size: 382684030510
---
# Oscar 2023_01 DE Deduplicated
This is a deduplicated version of the German subset of the [OSCAR 23.01 corpus](https://huggingface.co/datasets/oscar-corpus/OSCAR-2301), a large, crawled, and processed text dataset
curated by the OSCAR project (Open Super-large Crawled Aggregated coRpus).
OSCAR 23.01 is the January 2023 version of the OSCAR Corpus, based on the November/December 2022 dump of Common Crawl.
While quite similar to OSCAR 22.01, it contains several new features, including KenLM-based adult content detection, [...].
Deduplication was performed with the MinHash implementation from the `text-dedup` library by `ChenghaoMou`, available on [GitHub](https://github.com/ChenghaoMou/text-dedup), using the following command:
```bash
python -m text_dedup.minhash --path oscar-corpus/OSCAR-2301 --name "de" --cache_dir "../cache" --split "train" --column "text" --batch_size 10000 --output output/minhash_oscar_de_dedup
```
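The command above runs `text-dedup`'s MinHash deduplication. The core idea — estimating Jaccard similarity between documents from per-permutation minimum hashes of their shingle sets — can be sketched in plain Python. This is an illustration of the technique, not the library's actual implementation:

```python
import hashlib

def shingles(text: str, k: int = 3) -> set:
    """Character k-shingles of a text."""
    return {text[i:i + k] for i in range(len(text) - k + 1)}

def minhash_signature(text: str, num_perm: int = 64) -> list:
    """One minimum hash value per seeded hash function ("permutation")."""
    sig = []
    for seed in range(num_perm):
        salt = seed.to_bytes(16, "little")  # blake2b accepts up to 16 salt bytes
        sig.append(min(
            int.from_bytes(
                hashlib.blake2b(s.encode(), digest_size=8, salt=salt).digest(),
                "big",
            )
            for s in shingles(text)
        ))
    return sig

def estimated_jaccard(a: list, b: list) -> float:
    """Fraction of matching signature slots approximates Jaccard similarity."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

a = minhash_signature("the quick brown fox jumps over the lazy dog")
b = minhash_signature("the quick brown fox jumps over the lazy dog")
c = minhash_signature("completely unrelated content goes here instead")
print(estimated_jaccard(a, b))  # 1.0 for identical documents
```

In the real pipeline, signatures are additionally banded into LSH buckets so that near-duplicate candidates can be found without comparing every document pair.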
Find a filtered version of this dataset at [bjoernp/oscar2301_de_deduped_filtered](https://huggingface.co/datasets/bjoernp/oscar2301_de_deduped_filtered).
## Deduplication statistics
| Step | Runtime |
|---|---|
| Loading | 10.64s |
| MinHashing | 10574.02s |
| Clustering | 12187.65s |
| Filtering | 4198.70s |
| Saving | 3560.06s |
| Total | 30531.07s |
| Dataset | Number of documents |
|---|---|
| Before | 103299215 |
| After | 53172498 |
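The retention rate implied by the table can be checked with quick arithmetic:

```python
before, after = 103_299_215, 53_172_498
removed = before - after
# Roughly half of the German OSCAR documents were removed as near-duplicates.
print(f"removed {removed:,} documents ({removed / before:.1%}); kept {after / before:.1%}")
```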
## Dataset schema
```json
{
"text":"English sentence\nphrase en français\n????????????", // (1)
"meta":{
"warc_headers":{ // (2)
"warc-identified-content-language":"fra,eng",
"warc-target-uri":"https://fr.wikipedia.org/wiki/...",
"warc-record-id":"<urn:uuid:29eaa920-d299-4b1d-b687-c72bd8d68116>",
"warc-type":"conversion",
"content-length":"35298", // (3)
"warc-refers-to":"<urn:uuid:39e42055-0d94-4e45-9c6c-9e7056635d64>",
"warc-block-digest":"sha1:WFH2A5WHCS2H365GIAFYQPI7UOAMFGHB", // (3)
"warc-date":"2022-11-26T09:45:47Z",
"content-type":"text/plain"
},
"identification":{ // (4)
"label":"fr",
"prob":0.8938327
},
"harmful_pp":4063.1814, // (5)
"tlsh":"tlsh:T125315FF2B6088901EEA097015DB39B4600B...", // (6)
"quality_warnings":[ // (7)
"short_sentences",
"header",
"footer"
],
"categories":[ // (8)
"examen_pix",
"liste_bu"
],
"sentence_identifications":[ // (9)
{
"label":"fr",
"prob":0.99837273
},
{
"label":"en",
"prob":0.9992377
},
null
]
}
}
```
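Because each document carries this metadata, downstream filtering can be done without re-running the pipeline. Below is a minimal sketch of such a filter; the field names follow the schema above, but the `harmful_pp` threshold and the set of blocking warning labels are illustrative assumptions, not official values (per the cited perplexity paper, a *low* perplexity against the adult-content model is what marks a document as suspicious):

```python
# Sketch of metadata-based filtering over documents shaped like the schema above.
HARMFUL_PP_MIN = 1000.0  # assumed threshold: low perplexity vs. the adult-content LM is suspicious
BLOCKING_WARNINGS = {"noisy", "tiny"}  # hypothetical warning labels to reject

def keep(doc: dict) -> bool:
    meta = doc["meta"]
    ident = meta["identification"]
    if ident["label"] != "de" or ident["prob"] < 0.8:
        return False  # wrong language or low-confidence identification
    harmful = meta.get("harmful_pp")
    if harmful is not None and harmful < HARMFUL_PP_MIN:
        return False  # likely adult content under the assumed threshold
    warnings = set(meta.get("quality_warnings") or [])
    return not (warnings & BLOCKING_WARNINGS)

example = {
    "text": "Ein Beispieldokument.",
    "meta": {
        "identification": {"label": "de", "prob": 0.95},
        "harmful_pp": 4063.18,
        "quality_warnings": ["short_sentences"],
    },
}
```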
## Licensing
(Reproduced from the original OSCAR Corpus card. As a redistributed copy, we cannot reasonably comply with takedown requests ourselves.)
```
These data are released under this licensing scheme
We do not own any of the text from which these data has been extracted.
We license the actual packaging, the metadata and the annotations of these data under the Creative Commons CC0 license ("no rights reserved") http://creativecommons.org/publicdomain/zero/1.0/
To the extent possible under law, the OSCAR project, Inria, the University of Mannheim and DFKI GmbH have waived all copyright and related or neighboring rights to OSCAR
This work is published from: France and Germany.
[[[
Should you consider that our data contains material that is owned by you and should therefore not be reproduced here, please:
* Clearly identify yourself, with detailed contact data such as an address, telephone number or email address at which you can be contacted.
* Clearly identify the copyrighted work claimed to be infringed.
* Clearly identify the material that is claimed to be infringing and information reasonably sufficient to allow us to locate the material.
We will comply with legitimate requests by removing the affected sources from the next release of the corpus.
]]]
```
## Citation
```
@ARTICLE{2022arXiv221210440J,
author = {{Jansen}, Tim and {Tong}, Yangling and {Zevallos}, Victoria and {Ortiz Suarez}, Pedro},
title = "{Perplexed by Quality: A Perplexity-based Method for Adult and Harmful Content Detection in Multilingual Heterogeneous Web Data}",
journal = {arXiv e-prints},
keywords = {Computer Science - Computation and Language},
year = 2022,
month = dec,
eid = {arXiv:2212.10440},
pages = {arXiv:2212.10440},
doi = {10.48550/arXiv.2212.10440},
archivePrefix = {arXiv},
eprint = {2212.10440},
primaryClass = {cs.CL},
adsurl = {https://ui.adsabs.harvard.edu/abs/2022arXiv221210440J},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
@inproceedings{abadji-etal-2022-towards,
title = "Towards a Cleaner Document-Oriented Multilingual Crawled Corpus",
author = "Abadji, Julien and
Ortiz Suarez, Pedro and
Romary, Laurent and
Sagot, Beno{\^\i}t",
booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
month = jun,
year = "2022",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2022.lrec-1.463",
pages = "4344--4355",
    abstract = "The need for large raw corpora has dramatically increased in recent years with the introduction of transfer learning and semi-supervised learning methods to Natural Language Processing. And while there have been some recent attempts to manually curate the amount of data necessary to train large language models, the main way to obtain this data is still through automatic web crawling. In this paper we take the existing multilingual web corpus OSCAR and its pipeline Ungoliant that extracts and classifies data from Common Crawl at the line level, and propose a set of improvements and automatic annotations in order to produce a new document-oriented version of OSCAR that could prove more suitable to pre-train large generative language models as well as hopefully other applications in Natural Language Processing and Digital Humanities.",
}
@inproceedings{AbadjiOrtizSuarezRomaryetal.2021,
author = {Julien Abadji and Pedro Javier Ortiz Su{\'a}rez and Laurent Romary and Beno{\^i}t Sagot},
title = {Ungoliant: An optimized pipeline for the generation of a very large-scale multilingual web corpus},
series = {Proceedings of the Workshop on Challenges in the Management of Large Corpora (CMLC-9) 2021. Limerick, 12 July 2021 (Online-Event)},
editor = {Harald L{\"u}ngen and Marc Kupietz and Piotr Bański and Adrien Barbaresi and Simon Clematide and Ines Pisetta},
publisher = {Leibniz-Institut f{\"u}r Deutsche Sprache},
address = {Mannheim},
doi = {10.14618/ids-pub-10468},
url = {https://nbn-resolving.org/urn:nbn:de:bsz:mh39-104688},
pages = {1 -- 9},
year = {2021},
abstract = {Since the introduction of large language models in Natural Language Processing, large raw corpora have played a crucial role in Computational Linguistics. However, most of these large raw corpora are either available only for English or not available to the general public due to copyright issues. Nevertheless, there are some examples of freely available multilingual corpora for training Deep Learning NLP models, such as the OSCAR and Paracrawl corpora. However, they have quality issues, especially for low-resource languages. Moreover, recreating or updating these corpora is very complex. In this work, we try to reproduce and improve the goclassy pipeline used to create the OSCAR corpus. We propose a new pipeline that is faster, modular, parameterizable, and well documented. We use it to create a corpus similar to OSCAR but larger and based on recent data. Also, unlike OSCAR, the metadata information is at the document level. We release our pipeline under an open source license and publish the corpus under a research-only license.},
language = {en}
}
@article{kreutzer-etal-2022-quality,
title = "Quality at a Glance: An Audit of Web-Crawled Multilingual Datasets",
author = {Kreutzer, Julia and
Caswell, Isaac and
Wang, Lisa and
Wahab, Ahsan and
van Esch, Daan and
Ulzii-Orshikh, Nasanbayar and
Tapo, Allahsera and
Subramani, Nishant and
Sokolov, Artem and
Sikasote, Claytone and
Setyawan, Monang and
Sarin, Supheakmungkol and
Samb, Sokhar and
Sagot, Beno{\^\i}t and
Rivera, Clara and
Rios, Annette and
Papadimitriou, Isabel and
Osei, Salomey and
Suarez, Pedro Ortiz and
Orife, Iroro and
Ogueji, Kelechi and
Rubungo, Andre Niyongabo and
Nguyen, Toan Q. and
M{\"u}ller, Mathias and
M{\"u}ller, Andr{\'e} and
Muhammad, Shamsuddeen Hassan and
Muhammad, Nanda and
Mnyakeni, Ayanda and
Mirzakhalov, Jamshidbek and
Matangira, Tapiwanashe and
Leong, Colin and
Lawson, Nze and
Kudugunta, Sneha and
Jernite, Yacine and
Jenny, Mathias and
Firat, Orhan and
Dossou, Bonaventure F. P. and
Dlamini, Sakhile and
de Silva, Nisansa and
{\c{C}}abuk Ball{\i}, Sakine and
Biderman, Stella and
Battisti, Alessia and
Baruwa, Ahmed and
Bapna, Ankur and
Baljekar, Pallavi and
Azime, Israel Abebe and
Awokoya, Ayodele and
Ataman, Duygu and
Ahia, Orevaoghene and
Ahia, Oghenefego and
Agrawal, Sweta and
Adeyemi, Mofetoluwa},
journal = "Transactions of the Association for Computational Linguistics",
volume = "10",
year = "2022",
address = "Cambridge, MA",
publisher = "MIT Press",
url = "https://aclanthology.org/2022.tacl-1.4",
doi = "10.1162/tacl_a_00447",
pages = "50--72",
abstract = "With the success of large-scale pre-training and multilingual modeling in Natural Language Processing (NLP), recent years have seen a proliferation of large, Web-mined text datasets covering hundreds of languages. We manually audit the quality of 205 language-specific corpora released with five major public datasets (CCAligned, ParaCrawl, WikiMatrix, OSCAR, mC4). Lower-resource corpora have systematic issues: At least 15 corpora have no usable text, and a significant fraction contains less than 50{\%} sentences of acceptable quality. In addition, many are mislabeled or use nonstandard/ambiguous language codes. We demonstrate that these issues are easy to detect even for non-proficient speakers, and supplement the human audit with automatic analyses. Finally, we recommend techniques to evaluate and improve multilingual corpora and discuss potential risks that come with low-quality data releases.",
}
@inproceedings{ortiz-suarez-etal-2020-monolingual,
title = "A Monolingual Approach to Contextualized Word Embeddings for Mid-Resource Languages",
    author = "Ortiz Su{\'a}rez, Pedro Javier and
Romary, Laurent and
Sagot, Benoit",
booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.acl-main.156",
pages = "1703--1714",
abstract = "We use the multilingual OSCAR corpus, extracted from Common Crawl via language classification, filtering and cleaning, to train monolingual contextualized word embeddings (ELMo) for five mid-resource languages. We then compare the performance of OSCAR-based and Wikipedia-based ELMo embeddings for these languages on the part-of-speech tagging and parsing tasks. We show that, despite the noise in the Common-Crawl-based OSCAR data, embeddings trained on OSCAR perform much better than monolingual embeddings trained on Wikipedia. They actually equal or improve the current state of the art in tagging and parsing for all five languages. In particular, they also improve over multilingual Wikipedia-based contextual embeddings (multilingual BERT), which almost always constitutes the previous state of the art, thereby showing that the benefit of a larger, more diverse corpus surpasses the cross-lingual benefit of multilingual embedding architectures.",
}
@inproceedings{OrtizSuarezSagotRomary2019,
  author = {Pedro Javier {Ortiz Su{\'a}rez} and Beno{\^i}t Sagot and Laurent Romary},
title = {Asynchronous pipelines for processing huge corpora on medium to low resource infrastructures},
series = {Proceedings of the Workshop on Challenges in the Management of Large Corpora (CMLC-7) 2019. Cardiff, 22nd July 2019},
  editor = {Piotr Bański and Adrien Barbaresi and Hanno Biber and Evelyn Breiteneder and Simon Clematide and Marc Kupietz and Harald L{\"u}ngen and Caroline Iliadi},
  publisher = {Leibniz-Institut f{\"u}r Deutsche Sprache},
address = {Mannheim},
doi = {10.14618/ids-pub-9021},
url = {http://nbn-resolving.de/urn:nbn:de:bsz:mh39-90215},
pages = {9 -- 16},
year = {2019},
abstract = {Common Crawl is a considerably large, heterogeneous multilingual corpus comprised of crawled documents from the internet, surpassing 20TB of data and distributed as a set of more than 50 thousand plain text files where each contains many documents written in a wide variety of languages. Even though each document has a metadata block associated to it, this data lacks any information about the language in which each document is written, making it extremely difficult to use Common Crawl for monolingual applications. We propose a general, highly parallel, multithreaded pipeline to clean and classify Common Crawl by language; we specifically design it so that it runs efficiently on medium to low resource infrastructures where I/O speeds are the main constraint. We develop the pipeline so that it can be easily reapplied to any kind of heterogeneous corpus and so that it can be parameterised to a wide range of infrastructures. We also distribute a 6.3TB version of Common Crawl, filtered, classified by language, shuffled at line level in order to avoid copyright issues, and ready to be used for NLP applications.},
language = {en}
}
```

3funnn/en_corpora_parliament_processed
---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 308719337
num_examples: 2051014
download_size: 171304144
dataset_size: 308719337
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
michaelmallari/airbnb-usa-co-denver
---
license: mit
---
hmao/rule_gen_splunk
---
dataset_info:
features:
- name: instruction
dtype: 'null'
- name: rule
dtype: 'null'
- name: software
dtype: 'null'
- name: configuration
dtype: 'null'
- name: description
dtype: 'null'
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 1376
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rule_gen_splunk"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

dodosh/CodeSearchNet-py
---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: code
dtype: string
- name: docstring
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3837914
num_examples: 2000
download_size: 1740849
dataset_size: 3837914
---
# Dataset Card for "CodeSearchNet-py"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)

FINNUMBER/FINCH_TRAIN_ESG_NEWFORMAT
---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: 'null'
- name: answer
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4089849
num_examples: 460
download_size: 2200662
dataset_size: 4089849
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
zolak/twitter_dataset_78_1713070264
---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2592296
num_examples: 6646
download_size: 1294349
dataset_size: 2592296
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Tristan/olm-october-2022-tokenized-1024-no-bigscience-filters
---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 79176169656.0
num_examples: 12861626
download_size: 21440888036
dataset_size: 79176169656.0
---
# Dataset Card for "olm-october-2022-tokenized-1024-no-bigscience-filters"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)