| datasetId | card |
|---|---|
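Each `card` value in this dump opens with a `---`-delimited YAML frontmatter block, optionally followed by a Markdown body. A minimal stdlib sketch for separating the two (`split_card` is a hypothetical helper written for illustration, not part of any Hub library):

```python
def split_card(card: str):
    """Split a dataset-card string into (frontmatter, body).

    Assumes the card begins with a '---'-delimited YAML frontmatter
    block, as every row in this dump does; returns the raw YAML text
    and the remaining Markdown body.
    """
    lines = card.splitlines()
    if not lines or lines[0].strip() != "---":
        return "", card  # no frontmatter block
    for i in range(1, len(lines)):
        if lines[i].strip() == "---":
            return "\n".join(lines[1:i]), "\n".join(lines[i + 1:])
    return "", card  # unterminated block


fm, body = split_card("---\nlicense: mit\n---\n# Title\n")
print(fm)    # license: mit
print(body)  # # Title
```

The YAML text can then be handed to any YAML parser; cards with no body (license-only rows below) simply yield an empty body.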
2413016570yutian/vae_zip | ---
license: other
---
|
ashiyakatuka11/empathetic_dialogues_context | ---
dataset_info:
  features:
  - name: emotions
    dtype: string
  - name: prompts
    dtype: string
  - name: contexts
    dtype: string
  - name: utterances
    dtype: string
  - name: responses
    dtype: string
  splits:
  - name: train
    num_bytes: 32358480
    num_examples: 64636
  - name: val
    num_bytes: 5110390
    num_examples: 9308
  - name: test
    num_bytes: 5113744
    num_examples: 8426
  download_size: 15399742
  dataset_size: 42582614
---
# Dataset Card for "empathetic_dialogues_context"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MatsuoDochiai/Samuel | ---
license: openrail
---
|
Hashif/sunoaiaudio2 | ---
dataset_info:
  features:
  - name: audio
    struct:
    - name: array
      sequence: float32
    - name: sampling_rate
      dtype: int64
  - name: text
    dtype: string
  - name: prompt
    dtype: string
  splits:
  - name: train
    num_bytes: 19077107.35443038
    num_examples: 63
  - name: test
    num_bytes: 4844979.645569621
    num_examples: 16
  download_size: 25172349
  dataset_size: 23922087.0
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
---
|
autoevaluate/autoeval-eval-WillHeld__stereoset_zero-WillHeld__stereoset_zero-7a6673-2074067135 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- WillHeld/stereoset_zero
eval_info:
  task: text_zero_shot_classification
  model: bigscience/bloom-1b1
  metrics: []
  dataset_name: WillHeld/stereoset_zero
  dataset_config: WillHeld--stereoset_zero
  dataset_split: train
  col_mapping:
    text: text
    classes: classes
    target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-1b1
* Dataset: WillHeld/stereoset_zero
* Config: WillHeld--stereoset_zero
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@WillHeld](https://huggingface.co/WillHeld) for evaluating this model. |
bdvysg/test | ---
license: openrail
---
|
autoevaluate/autoeval-staging-eval-project-xsum-d2b9e56c-12525674 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
  task: summarization
  model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP
  metrics: []
  dataset_name: xsum
  dataset_config: default
  dataset_split: test
  col_mapping:
    text: document
    target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP
* Dataset: xsum
* Config: default
* Split: test
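The `col_mapping` in the metadata above pairs the column names the evaluation task expects (`text`, `target`) with the columns this dataset actually provides (`document`, `summary`). A minimal sketch of how such a mapping could be applied to a single row (`apply_col_mapping` is a hypothetical helper written for illustration, not AutoTrain's actual API):

```python
def apply_col_mapping(row: dict, col_mapping: dict) -> dict:
    """Rename dataset columns to the names the evaluation task expects.

    col_mapping maps task-side names to dataset-side names, as in the
    metadata above (text -> document, target -> summary).
    """
    return {task_col: row[ds_col] for task_col, ds_col in col_mapping.items()}


row = {"document": "Long article ...", "summary": "Short abstract.", "id": "42"}
mapped = apply_col_mapping(row, {"text": "document", "target": "summary"})
print(mapped)  # {'text': 'Long article ...', 'target': 'Short abstract.'}
```

Columns not named in the mapping (here `id`) are simply dropped from the task's view of the row.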
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
marmofayezi/M3GenMaskFrench | ---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: image
    dtype: image
  - name: mask
    dtype: image
  - name: caption
    dtype: string
  - name: landmark
    dtype: image
  - name: generated_image
    dtype: image
  splits:
  - name: train
    num_bytes: 2381006130.0
    num_examples: 2998
  download_size: 2002792105
  dataset_size: 2381006130.0
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
|
michaelmallari/airbnb-usa-mn-twincities | ---
license: mit
---
|
jinwoos/car-shadow-dataset-3 | ---
dataset_info:
  features:
  - name: original_image
    dtype: image
  - name: edit_prompt
    dtype: string
  - name: cartoonized_image
    dtype: image
  splits:
  - name: train
    num_bytes: 5069347829.05
    num_examples: 1450
  download_size: 5030304345
  dataset_size: 5069347829.05
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
|
Diegomejia/ds1ucb | ---
license: mit
---
|
open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b-v1 | ---
pretty_name: Evaluation run of CobraMamba/mamba-gpt-7b-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CobraMamba/mamba-gpt-7b-v1](https://huggingface.co/CobraMamba/mamba-gpt-7b-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b-v1_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-09T14:34:23.926109](https://huggingface.co/datasets/open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b-v1_public/blob/main/results_2023-11-09T14-34-23.926109.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```json\n{\n    \"all\": {\n        \"acc\": 0.6286909633628079,\n\
\ \"acc_stderr\": 0.03215522070353069,\n \"acc_norm\": 0.6377478775248846,\n\
\ \"acc_norm_stderr\": 0.032851877291432414,\n \"mc1\": 0.3084455324357405,\n\
\ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.4634199786351567,\n\
\ \"mc2_stderr\": 0.014481061527331505,\n \"em\": 0.2679320469798658,\n\
\ \"em_stderr\": 0.004535526201164825,\n \"f1\": 0.31668204697986585,\n\
\ \"f1_stderr\": 0.004459593071277455\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.575938566552901,\n \"acc_stderr\": 0.014441889627464396,\n\
\ \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.01423587248790987\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6354311890061741,\n\
\ \"acc_stderr\": 0.004803253812881043,\n \"acc_norm\": 0.8409679346743677,\n\
\ \"acc_norm_stderr\": 0.003649585852821192\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396264,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396264\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"\
acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175007,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.02424378399406216,\n \
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.02424378399406216\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.02925290592725198,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.02925290592725198\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.03104194130405929,\n \
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.03104194130405929\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431385,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431385\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.0345727283691767,\n \"acc_norm\"\
: 0.8264462809917356,\n \"acc_norm_stderr\": 0.0345727283691767\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.013890862162876163,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.013890862162876163\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.01442229220480884,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.01442229220480884\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153262,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153262\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.025171041915309684,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.025171041915309684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n\
\ \"acc_stderr\": 0.012700582404768221,\n \"acc_norm\": 0.44784876140808344,\n\
\ \"acc_norm_stderr\": 0.012700582404768221\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \
\ \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3084455324357405,\n\
\ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.4634199786351567,\n\
\ \"mc2_stderr\": 0.014481061527331505\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.01141455439998773\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.2679320469798658,\n \
\ \"em_stderr\": 0.004535526201164825,\n \"f1\": 0.31668204697986585,\n\
\ \"f1_stderr\": 0.004459593071277455\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.17361637604245642,\n \"acc_stderr\": 0.01043346322125763\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CobraMamba/mamba-gpt-7b-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
  data_files:
  - split: 2023_11_09T14_34_23.926109
    path:
    - '**/details_harness|arc:challenge|25_2023-11-09T14-34-23.926109.parquet'
  - split: latest
    path:
    - '**/details_harness|arc:challenge|25_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_drop_3
  data_files:
  - split: 2023_11_09T14_34_23.926109
    path:
    - '**/details_harness|drop|3_2023-11-09T14-34-23.926109.parquet'
  - split: latest
    path:
    - '**/details_harness|drop|3_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_gsm8k_5
  data_files:
  - split: 2023_11_09T14_34_23.926109
    path:
    - '**/details_harness|gsm8k|5_2023-11-09T14-34-23.926109.parquet'
  - split: latest
    path:
    - '**/details_harness|gsm8k|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hellaswag_10
  data_files:
  - split: 2023_11_09T14_34_23.926109
    path:
    - '**/details_harness|hellaswag|10_2023-11-09T14-34-23.926109.parquet'
  - split: latest
    path:
    - '**/details_harness|hellaswag|10_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_5
  data_files:
  - split: 2023_11_09T14_34_23.926109
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-management|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-34-23.926109.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-management|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-34-23.926109.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
  data_files:
  - split: 2023_11_09T14_34_23.926109
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-34-23.926109.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_anatomy_5
  data_files:
  - split: 2023_11_09T14_34_23.926109
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-34-23.926109.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_astronomy_5
  data_files:
  - split: 2023_11_09T14_34_23.926109
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-34-23.926109.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
  data_files:
  - split: 2023_11_09T14_34_23.926109
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-34-23.926109.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
  data_files:
  - split: 2023_11_09T14_34_23.926109
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-34-23.926109.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_college_biology_5
  data_files:
  - split: 2023_11_09T14_34_23.926109
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-34-23.926109.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
  data_files:
  - split: 2023_11_09T14_34_23.926109
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-34-23.926109.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
  data_files:
  - split: 2023_11_09T14_34_23.926109
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-34-23.926109.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
  data_files:
  - split: 2023_11_09T14_34_23.926109
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-34-23.926109.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
  data_files:
  - split: 2023_11_09T14_34_23.926109
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-34-23.926109.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-09T14-34-23.926109.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- '**/details_harness|winogrande|5_2023-11-09T14-34-23.926109.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-09T14-34-23.926109.parquet'
- config_name: results
data_files:
- split: 2023_11_09T14_34_23.926109
path:
- results_2023-11-09T14-34-23.926109.parquet
- split: latest
path:
- results_2023-11-09T14-34-23.926109.parquet
---
# Dataset Card for Evaluation run of CobraMamba/mamba-gpt-7b-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CobraMamba/mamba-gpt-7b-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CobraMamba/mamba-gpt-7b-v1](https://huggingface.co/CobraMamba/mamba-gpt-7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b-v1_public",
"harness_winogrande_5",
split="train")
```
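As a side note, the timestamped split names shown in the configurations above follow a sortable `YYYY_MM_DDTHH_MM_SS.ffffff` pattern, so the most recent run can also be recovered programmatically. A minimal sketch, not part of the generated snippet above (the helper name `newest_split` is illustrative):

```python
from datetime import datetime

# Split names follow the pattern YYYY_MM_DDTHH_MM_SS.microseconds,
# e.g. "2023_11_09T14_34_23.926109"; "latest" is an alias for the newest run.
SPLIT_FORMAT = "%Y_%m_%dT%H_%M_%S.%f"

def newest_split(split_names):
    """Return the most recent timestamped split among `split_names`."""
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped, key=lambda s: datetime.strptime(s, SPLIT_FORMAT))

splits = ["2023_09_17T06_39_08.245014", "2023_11_09T14_34_23.926109", "latest"]
print(newest_split(splits))  # the November run is newer
```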
## Latest results
These are the [latest results from run 2023-11-09T14:34:23.926109](https://huggingface.co/datasets/open-llm-leaderboard/details_CobraMamba__mamba-gpt-7b-v1_public/blob/main/results_2023-11-09T14-34-23.926109.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6286909633628079,
"acc_stderr": 0.03215522070353069,
"acc_norm": 0.6377478775248846,
"acc_norm_stderr": 0.032851877291432414,
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.4634199786351567,
"mc2_stderr": 0.014481061527331505,
"em": 0.2679320469798658,
"em_stderr": 0.004535526201164825,
"f1": 0.31668204697986585,
"f1_stderr": 0.004459593071277455
},
"harness|arc:challenge|25": {
"acc": 0.575938566552901,
"acc_stderr": 0.014441889627464396,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.01423587248790987
},
"harness|hellaswag|10": {
"acc": 0.6354311890061741,
"acc_stderr": 0.004803253812881043,
"acc_norm": 0.8409679346743677,
"acc_norm_stderr": 0.003649585852821192
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396264,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396264
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.02424378399406216,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.02424378399406216
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.02925290592725198,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.02925290592725198
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.03104194130405929,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.03104194130405929
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431385,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431385
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.0345727283691767,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.0345727283691767
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911901,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911901
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876163,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876163
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.01442229220480884,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.01442229220480884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153262,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153262
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.025171041915309684,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.025171041915309684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.012700582404768221,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.012700582404768221
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.4634199786351567,
"mc2_stderr": 0.014481061527331505
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.01141455439998773
},
"harness|drop|3": {
"em": 0.2679320469798658,
"em_stderr": 0.004535526201164825,
"f1": 0.31668204697986585,
"f1_stderr": 0.004459593071277455
},
"harness|gsm8k|5": {
"acc": 0.17361637604245642,
"acc_stderr": 0.01043346322125763
}
}
```
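The `"all"` block at the top of the results above is the leaderboard's own aggregate. As a hedged illustration of how such a macro-average is formed, here is a sketch over a handful of the per-task `acc` values listed above (the actual aggregate spans every evaluated task, not just these four):

```python
# Hedged sketch (not part of the generated card): a simple macro-average
# over a few of the per-task "acc" values listed above.
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.26,
    "harness|hendrycksTest-anatomy|5": 0.6370370370370371,
    "harness|hendrycksTest-astronomy|5": 0.631578947368421,
    "harness|hendrycksTest-business_ethics|5": 0.61,
}

macro_avg = sum(task_acc.values()) / len(task_acc)
print(f"macro-average acc over {len(task_acc)} tasks: {macro_avg:.4f}")
```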
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Chuka-J-Uzo/5Million_fraud_detection_dataset | ---
license: mit
---
|
open-llm-leaderboard/details_Aspik101__Vicuzard-30B-Uncensored-instruct-PL-lora_unload | ---
pretty_name: Evaluation run of Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload](https://huggingface.co/Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aspik101__Vicuzard-30B-Uncensored-instruct-PL-lora_unload\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T06:39:08.245014](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__Vicuzard-30B-Uncensored-instruct-PL-lora_unload/blob/main/results_2023-09-17T06-39-08.245014.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004404362416107382,\n\
\ \"em_stderr\": 0.0006781451620479503,\n \"f1\": 0.07459731543624175,\n\
\ \"f1_stderr\": 0.001589740038419953,\n \"acc\": 0.46844372186482186,\n\
\ \"acc_stderr\": 0.010745171507100412\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.004404362416107382,\n \"em_stderr\": 0.0006781451620479503,\n\
\ \"f1\": 0.07459731543624175,\n \"f1_stderr\": 0.001589740038419953\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15314632297194844,\n \
\ \"acc_stderr\": 0.009919728152791473\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.011570614861409348\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|arc:challenge|25_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T06_39_08.245014
path:
- '**/details_harness|drop|3_2023-09-17T06-39-08.245014.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T06-39-08.245014.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T06_39_08.245014
path:
- '**/details_harness|gsm8k|5_2023-09-17T06-39-08.245014.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T06-39-08.245014.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hellaswag|10_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:03:01.897575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T14:03:01.897575.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T14:03:01.897575.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T06_39_08.245014
path:
- '**/details_harness|winogrande|5_2023-09-17T06-39-08.245014.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T06-39-08.245014.parquet'
- config_name: results
data_files:
- split: 2023_08_09T14_03_01.897575
path:
- results_2023-08-09T14:03:01.897575.parquet
- split: 2023_09_17T06_39_08.245014
path:
- results_2023-09-17T06-39-08.245014.parquet
- split: latest
path:
- results_2023-09-17T06-39-08.245014.parquet
---
# Dataset Card for Evaluation run of Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload](https://huggingface.co/Aspik101/Vicuzard-30B-Uncensored-instruct-PL-lora_unload) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aspik101__Vicuzard-30B-Uncensored-instruct-PL-lora_unload",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T06:39:08.245014](https://huggingface.co/datasets/open-llm-leaderboard/details_Aspik101__Vicuzard-30B-Uncensored-instruct-PL-lora_unload/blob/main/results_2023-09-17T06-39-08.245014.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004404362416107382,
"em_stderr": 0.0006781451620479503,
"f1": 0.07459731543624175,
"f1_stderr": 0.001589740038419953,
"acc": 0.46844372186482186,
"acc_stderr": 0.010745171507100412
},
"harness|drop|3": {
"em": 0.004404362416107382,
"em_stderr": 0.0006781451620479503,
"f1": 0.07459731543624175,
"f1_stderr": 0.001589740038419953
},
"harness|gsm8k|5": {
"acc": 0.15314632297194844,
"acc_stderr": 0.009919728152791473
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.011570614861409348
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Multimodal-Fatima/VQAv2_sample_validation_facebook_opt_6.7b_mode_VQAv2_visclues_ns_20_open_ended | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_bs_8
num_bytes: 3172
num_examples: 20
download_size: 3386
dataset_size: 3172
---
# Dataset Card for "VQAv2_sample_validation_facebook_opt_6.7b_mode_VQAv2_visclues_ns_20_open_ended"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
minghua/PartSLIP | ---
license: openrail
---
|
manu/tok-corpus-shuffled | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: dataset_id
dtype: string
splits:
- name: train
num_bytes: 124374684398.0
num_examples: 31661477
download_size: 67451668212
dataset_size: 124374684398.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "tok-corpus-shuffled"
This is the dataset used to fit custom tokenizers. The goal is to have a tokenizer that works well for French, English, and code.
The dataset uploaded is shuffled to facilitate subsampling it for tokenizer training.
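Because the corpus is pre-shuffled, taking a contiguous prefix of the streaming dataset already behaves like a uniform subsample. A minimal sketch of that subsampling step (the helper name and sample size are illustrative; in practice `rows` would come from `load_dataset("manu/tok-corpus-shuffled", split="train", streaming=True)`):

```python
from itertools import islice

def take_texts(rows, n):
    """Collect the first n 'text' fields from an iterable of rows.

    Since the corpus is shuffled, this prefix is effectively a random
    subsample, which is all a tokenizer trainer needs.
    """
    return [row["text"] for row in islice(rows, n)]

# Stand-in for the streaming dataset, mimicking its row schema.
corpus = ({"id": str(i), "text": f"doc {i}", "dataset_id": "demo"}
          for i in range(1000))
sample = take_texts(corpus, 3)
print(sample)  # ['doc 0', 'doc 1', 'doc 2']
```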
```
French
Dataset({
features: ['id', 'text', 'dataset_id'],
num_rows: 16881941
})
Code
Dataset({
features: ['id', 'text', 'dataset_id'],
num_rows: 6338566
})
English
Dataset({
features: ['text', 'id', 'dataset_id'],
num_rows: 8440970
})
Size of Concatenated: 124.0 GB
Size of French: 58.0 GB, ratio of 0.4705131689639972
Size of Code: 28.0 GB, ratio of 0.23046591420706297
Size of English: 37.0 GB, ratio of 0.29902091682893983
``` |
open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v17.1-32k | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-mistral-7b-v17.1-32k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-mistral-7b-v17.1-32k](https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v17.1-32k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v17.1-32k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-12T05:26:38.724116](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v17.1-32k/blob/main/results_2024-02-12T05-26-38.724116.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5820294540126106,\n\
\ \"acc_stderr\": 0.03338687908537487,\n \"acc_norm\": 0.5857871578259474,\n\
\ \"acc_norm_stderr\": 0.03406677498994728,\n \"mc1\": 0.38922888616891066,\n\
\ \"mc1_stderr\": 0.017068552680690328,\n \"mc2\": 0.5606214128788316,\n\
\ \"mc2_stderr\": 0.015225071278712598\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.515358361774744,\n \"acc_stderr\": 0.014604496129394908,\n\
\ \"acc_norm\": 0.5554607508532423,\n \"acc_norm_stderr\": 0.014521226405627084\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5853415654252141,\n\
\ \"acc_stderr\": 0.004916561213591288,\n \"acc_norm\": 0.7795259908384784,\n\
\ \"acc_norm_stderr\": 0.004137190475425526\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383886,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383886\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.045796394220704334,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.045796394220704334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36772486772486773,\n \"acc_stderr\": 0.02483383982556242,\n \"\
acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.02483383982556242\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n\
\ \"acc_stderr\": 0.025189006660212385,\n \"acc_norm\": 0.7322580645161291,\n\
\ \"acc_norm_stderr\": 0.025189006660212385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.025158266016868585,\n\
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.025158266016868585\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096625,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096625\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790215,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790215\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.042059539338841226,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.042059539338841226\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n\
\ \"acc_stderr\": 0.015411308769686933,\n \"acc_norm\": 0.7535121328224776,\n\
\ \"acc_norm_stderr\": 0.015411308769686933\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.02590663263101613,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.02590663263101613\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n\
\ \"acc_stderr\": 0.01489339173524962,\n \"acc_norm\": 0.27262569832402234,\n\
\ \"acc_norm_stderr\": 0.01489339173524962\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906504,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906504\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.02682280175950789,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.02682280175950789\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284066,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284066\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4302477183833116,\n\
\ \"acc_stderr\": 0.012645361435115226,\n \"acc_norm\": 0.4302477183833116,\n\
\ \"acc_norm_stderr\": 0.012645361435115226\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n\
\ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5849673202614379,\n \"acc_stderr\": 0.019933627776857425,\n \
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.019933627776857425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691583,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691583\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38922888616891066,\n\
\ \"mc1_stderr\": 0.017068552680690328,\n \"mc2\": 0.5606214128788316,\n\
\ \"mc2_stderr\": 0.015225071278712598\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.749802683504341,\n \"acc_stderr\": 0.01217300964244915\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4268385140257771,\n \
\ \"acc_stderr\": 0.013624249696595226\n }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v17.1-32k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|arc:challenge|25_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|arc:challenge|25_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|gsm8k|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|gsm8k|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hellaswag|10_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hellaswag|10_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T19-10-05.948412.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T05-26-38.724116.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T05-26-38.724116.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- '**/details_harness|winogrande|5_2024-01-28T19-10-05.948412.parquet'
- split: 2024_02_12T05_26_38.724116
path:
- '**/details_harness|winogrande|5_2024-02-12T05-26-38.724116.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-12T05-26-38.724116.parquet'
- config_name: results
data_files:
- split: 2024_01_28T19_10_05.948412
path:
- results_2024-01-28T19-10-05.948412.parquet
- split: 2024_02_12T05_26_38.724116
path:
- results_2024-02-12T05-26-38.724116.parquet
- split: latest
path:
- results_2024-02-12T05-26-38.724116.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mistral-7b-v17.1-32k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-mistral-7b-v17.1-32k](https://huggingface.co/OpenBuddy/openbuddy-mistral-7b-v17.1-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v17.1-32k",
"harness_winogrande_5",
split="train")
```
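The per-run splits are named after the timestamp of the evaluation run (for instance `2024_02_12T05_26_38.724116`). If you want to sort or compare runs programmatically, a small helper can parse those split names back into `datetime` objects — a minimal sketch, assuming the underscore-separated format shown in the configs above:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names look like "2024_02_12T05_26_38.724116":
    # underscores stand in for the usual "-" and ":" separators.
    date_part, time_part = split_name.split("T")
    date_part = date_part.replace("_", "-")
    time_part = time_part.replace("_", ":")
    return datetime.fromisoformat(f"{date_part}T{time_part}")

print(split_to_datetime("2024_02_12T05_26_38.724116"))
# → 2024-02-12 05:26:38.724116
```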
## Latest results
These are the [latest results from run 2024-02-12T05:26:38.724116](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mistral-7b-v17.1-32k/blob/main/results_2024-02-12T05-26-38.724116.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5820294540126106,
"acc_stderr": 0.03338687908537487,
"acc_norm": 0.5857871578259474,
"acc_norm_stderr": 0.03406677498994728,
"mc1": 0.38922888616891066,
"mc1_stderr": 0.017068552680690328,
"mc2": 0.5606214128788316,
"mc2_stderr": 0.015225071278712598
},
"harness|arc:challenge|25": {
"acc": 0.515358361774744,
"acc_stderr": 0.014604496129394908,
"acc_norm": 0.5554607508532423,
"acc_norm_stderr": 0.014521226405627084
},
"harness|hellaswag|10": {
"acc": 0.5853415654252141,
"acc_stderr": 0.004916561213591288,
"acc_norm": 0.7795259908384784,
"acc_norm_stderr": 0.004137190475425526
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383886,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383886
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.045796394220704334,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.045796394220704334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.02483383982556242,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.02483383982556242
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.025158266016868585,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.025158266016868585
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959916,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096625,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096625
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790215,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790215
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.042059539338841226,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.042059539338841226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7535121328224776,
"acc_stderr": 0.015411308769686933,
"acc_norm": 0.7535121328224776,
"acc_norm_stderr": 0.015411308769686933
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.02590663263101613,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.02590663263101613
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.01489339173524962,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.01489339173524962
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906504,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906504
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488544,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488544
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.02682280175950789,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.02682280175950789
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284066,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284066
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4302477183833116,
"acc_stderr": 0.012645361435115226,
"acc_norm": 0.4302477183833116,
"acc_norm_stderr": 0.012645361435115226
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5367647058823529,
"acc_stderr": 0.03029061918048569,
"acc_norm": 0.5367647058823529,
"acc_norm_stderr": 0.03029061918048569
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.019933627776857425,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.019933627776857425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982066,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982066
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691583,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691583
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38922888616891066,
"mc1_stderr": 0.017068552680690328,
"mc2": 0.5606214128788316,
"mc2_stderr": 0.015225071278712598
},
"harness|winogrande|5": {
"acc": 0.749802683504341,
"acc_stderr": 0.01217300964244915
},
"harness|gsm8k|5": {
"acc": 0.4268385140257771,
"acc_stderr": 0.013624249696595226
}
}
```
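As a quick sanity check, aggregates like the `"all"` block above can be recomputed from the per-task entries. The sketch below is illustrative only: it copies three of the MMLU (`hendrycksTest`) accuracies from the results shown above into a plain dict and averages them — the real leaderboard aggregation covers every task, not just this excerpt.

```python
import statistics

# A small excerpt of the per-task results above (values copied verbatim).
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.4879518072289157},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7251461988304093},
    "harness|hendrycksTest-sociology|5": {"acc": 0.8009950248756219},
}

# Mean accuracy over the MMLU subtasks present in the excerpt.
mmlu_accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
print(round(statistics.mean(mmlu_accs), 4))  # → 0.6714
```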
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
distilled-from-one-sec-cv12/chunk_242 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 971893028
num_examples: 189379
download_size: 993095587
dataset_size: 971893028
---
# Dataset Card for "chunk_242"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-20b | ---
pretty_name: Evaluation run of h2oai/h2ogpt-oasst1-512-20b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-oasst1-512-20b](https://huggingface.co/h2oai/h2ogpt-oasst1-512-20b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-20b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T03:05:37.709537](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-20b/blob/main/results_2023-10-19T03-05-37.709537.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.00037786091964609505,\n \"f1\": 0.05176384228187931,\n\
\ \"f1_stderr\": 0.0012682806127954247,\n \"acc\": 0.3560947909043528,\n\
\ \"acc_stderr\": 0.008971438537963025\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964609505,\n\
\ \"f1\": 0.05176384228187931,\n \"f1_stderr\": 0.0012682806127954247\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03184230477634572,\n \
\ \"acc_stderr\": 0.004836348558260912\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6803472770323599,\n \"acc_stderr\": 0.013106528517665137\n\
\ }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-oasst1-512-20b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T03_05_37.709537
path:
- '**/details_harness|drop|3_2023-10-19T03-05-37.709537.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T03-05-37.709537.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T03_05_37.709537
path:
- '**/details_harness|gsm8k|5_2023-10-19T03-05-37.709537.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T03-05-37.709537.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:43:07.012781.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:43:07.012781.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:43:07.012781.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T03_05_37.709537
path:
- '**/details_harness|winogrande|5_2023-10-19T03-05-37.709537.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T03-05-37.709537.parquet'
- config_name: results
data_files:
- split: 2023_07_19T21_43_07.012781
path:
- results_2023-07-19T21:43:07.012781.parquet
- split: 2023_10_19T03_05_37.709537
path:
- results_2023-10-19T03-05-37.709537.parquet
- split: latest
path:
- results_2023-10-19T03-05-37.709537.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-oasst1-512-20b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-oasst1-512-20b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-oasst1-512-20b](https://huggingface.co/h2oai/h2ogpt-oasst1-512-20b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-20b",
"harness_winogrande_5",
split="train")
```
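The timestamped split names listed in the configurations above appear to be derived from the run timestamp by replacing the characters that are not allowed in split names. A minimal sketch of that mapping, inferred from the names in this card (not an official API):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp such as '2023-10-19T03:05:37.709537'
    to the corresponding split name '2023_10_19T03_05_37.709537'.
    (Mapping inferred from the split names in this card.)"""
    date, _, time = ts.partition("T")
    # '-' in the date part and ':' in the time part both become '_'
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(run_timestamp_to_split("2023-10-19T03:05:37.709537"))
# → 2023_10_19T03_05_37.709537
```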
## Latest results
These are the [latest results from run 2023-10-19T03:05:37.709537](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-20b/blob/main/results_2023-10-19T03-05-37.709537.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964609505,
"f1": 0.05176384228187931,
"f1_stderr": 0.0012682806127954247,
"acc": 0.3560947909043528,
"acc_stderr": 0.008971438537963025
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964609505,
"f1": 0.05176384228187931,
"f1_stderr": 0.0012682806127954247
},
"harness|gsm8k|5": {
"acc": 0.03184230477634572,
"acc_stderr": 0.004836348558260912
},
"harness|winogrande|5": {
"acc": 0.6803472770323599,
"acc_stderr": 0.013106528517665137
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
collabora/ai4bharat-shrutilipi | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 172325190605.156
num_examples: 260806
download_size: 171129315620
dataset_size: 172325190605.156
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ashraq/hotel-reviews | ---
dataset_info:
features:
- name: review_date
dtype: string
- name: hotel_name
dtype: string
- name: review
dtype: string
splits:
- name: train
num_bytes: 15043294
num_examples: 93757
download_size: 6100544
dataset_size: 15043294
---
# Dataset Card for "hotel-reviews"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Data was obtained from [here](https://www.kaggle.com/datasets/jiashenliu/515k-hotel-reviews-data-in-europe) |
lab156/github-issues | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
- name: html_url
dtype: string
- name: comments
sequence: string
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: number
dtype: int64
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 13797091
num_examples: 5599
download_size: 5345250
dataset_size: 13797091
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ArianAskari__SOLID_SFT-WoDPO-WoMixQ | ---
pretty_name: Evaluation run of ArianAskari/SOLID_SFT-WoDPO-WoMixQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ArianAskari/SOLID_SFT-WoDPO-WoMixQ](https://huggingface.co/ArianAskari/SOLID_SFT-WoDPO-WoMixQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ArianAskari__SOLID_SFT-WoDPO-WoMixQ\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-11T14:26:56.210922](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID_SFT-WoDPO-WoMixQ/blob/main/results_2024-02-11T14-26-56.210922.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5950015763685916,\n\
\ \"acc_stderr\": 0.033165134676359634,\n \"acc_norm\": 0.6045796991893356,\n\
\ \"acc_norm_stderr\": 0.03392668551306121,\n \"mc1\": 0.397796817625459,\n\
\ \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5524968462205945,\n\
\ \"mc2_stderr\": 0.01602039404250384\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.014500682618212864,\n\
\ \"acc_norm\": 0.5964163822525598,\n \"acc_norm_stderr\": 0.014337158914268448\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6353316072495518,\n\
\ \"acc_stderr\": 0.004803533333364225,\n \"acc_norm\": 0.8168691495717985,\n\
\ \"acc_norm_stderr\": 0.003859833044230901\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490436,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490436\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365242,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365242\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.03692820767264866,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.03692820767264866\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.032662042990646775,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.032662042990646775\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164535,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164535\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.024864995159767745,\n\
\ \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.024864995159767745\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.017324352325016015,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.017324352325016015\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375798,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375798\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.045723723587374296,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.045723723587374296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281386,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281386\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n\
\ \"acc_stderr\": 0.014927447101937148,\n \"acc_norm\": 0.7752234993614304,\n\
\ \"acc_norm_stderr\": 0.014927447101937148\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242826,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242826\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n\
\ \"acc_stderr\": 0.015748421208187303,\n \"acc_norm\": 0.3318435754189944,\n\
\ \"acc_norm_stderr\": 0.015748421208187303\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.026643278474508758,\n\
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.026643278474508758\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.02651759772446501,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.02651759772446501\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291484,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291484\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42633637548891784,\n\
\ \"acc_stderr\": 0.012630884771599698,\n \"acc_norm\": 0.42633637548891784,\n\
\ \"acc_norm_stderr\": 0.012630884771599698\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824866,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824866\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6160130718954249,\n \"acc_stderr\": 0.0196758081352815,\n \
\ \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.0196758081352815\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.397796817625459,\n\
\ \"mc1_stderr\": 0.017133934248559635,\n \"mc2\": 0.5524968462205945,\n\
\ \"mc2_stderr\": 0.01602039404250384\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7466456195737964,\n \"acc_stderr\": 0.012223754434233621\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09476876421531463,\n \
\ \"acc_stderr\": 0.008067791560015442\n }\n}\n```"
repo_url: https://huggingface.co/ArianAskari/SOLID_SFT-WoDPO-WoMixQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|arc:challenge|25_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|gsm8k|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hellaswag|10_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T14-26-56.210922.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T14-26-56.210922.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- '**/details_harness|winogrande|5_2024-02-11T14-26-56.210922.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-11T14-26-56.210922.parquet'
- config_name: results
data_files:
- split: 2024_02_11T14_26_56.210922
path:
- results_2024-02-11T14-26-56.210922.parquet
- split: latest
path:
- results_2024-02-11T14-26-56.210922.parquet
---
# Dataset Card for Evaluation run of ArianAskari/SOLID_SFT-WoDPO-WoMixQ
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ArianAskari/SOLID_SFT-WoDPO-WoMixQ](https://huggingface.co/ArianAskari/SOLID_SFT-WoDPO-WoMixQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ArianAskari__SOLID_SFT-WoDPO-WoMixQ",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-11T14:26:56.210922](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__SOLID_SFT-WoDPO-WoMixQ/blob/main/results_2024-02-11T14-26-56.210922.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of each configuration):
```python
{
"all": {
"acc": 0.5950015763685916,
"acc_stderr": 0.033165134676359634,
"acc_norm": 0.6045796991893356,
"acc_norm_stderr": 0.03392668551306121,
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559635,
"mc2": 0.5524968462205945,
"mc2_stderr": 0.01602039404250384
},
"harness|arc:challenge|25": {
"acc": 0.5614334470989761,
"acc_stderr": 0.014500682618212864,
"acc_norm": 0.5964163822525598,
"acc_norm_stderr": 0.014337158914268448
},
"harness|hellaswag|10": {
"acc": 0.6353316072495518,
"acc_stderr": 0.004803533333364225,
"acc_norm": 0.8168691495717985,
"acc_norm_stderr": 0.003859833044230901
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490436,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490436
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365242,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365242
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.03692820767264866,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.03692820767264866
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.032662042990646775,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.032662042990646775
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997695,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164535,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164535
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5974358974358974,
"acc_stderr": 0.024864995159767745,
"acc_norm": 0.5974358974358974,
"acc_norm_stderr": 0.024864995159767745
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.017324352325016015,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.017324352325016015
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375798,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375798
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.045723723587374296,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.045723723587374296
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281386,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281386
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.014927447101937148,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.014927447101937148
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242826,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242826
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187303,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187303
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.026643278474508758,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.026643278474508758
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291484,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291484
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42633637548891784,
"acc_stderr": 0.012630884771599698,
"acc_norm": 0.42633637548891784,
"acc_norm_stderr": 0.012630884771599698
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824866,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824866
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.0196758081352815,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.0196758081352815
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559635,
"mc2": 0.5524968462205945,
"mc2_stderr": 0.01602039404250384
},
"harness|winogrande|5": {
"acc": 0.7466456195737964,
"acc_stderr": 0.012223754434233621
},
"harness|gsm8k|5": {
"acc": 0.09476876421531463,
"acc_stderr": 0.008067791560015442
}
}
```
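The per-task entries above share a uniform structure, so headline metrics can be pulled out programmatically. A minimal sketch (the JSON excerpt below is abbreviated from the results shown above, and the "prefer `acc_norm` when present" rule is an illustrative choice, not the leaderboard's exact aggregation):

```python
import json

# Abbreviated excerpt of the results structure shown above.
results = json.loads("""
{
  "harness|arc:challenge|25": {"acc_norm": 0.5964163822525598},
  "harness|hellaswag|10": {"acc_norm": 0.8168691495717985},
  "harness|winogrande|5": {"acc": 0.7466456195737964}
}
""")

# Collect one headline metric per task, preferring acc_norm when present.
scores = {task: m.get("acc_norm", m.get("acc")) for task, m in results.items()}
print(scores["harness|winogrande|5"])
```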
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
carles-undergrad-thesis/msmarco-corpus-en-id-parallel-sentences | ---
dataset_info:
features:
- name: text_en
dtype: string
- name: text_id
dtype: string
splits:
- name: train
num_bytes: 6084997331
num_examples: 8841823
download_size: 3258000585
dataset_size: 6084997331
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "msmarco-corpus-en-id-parallel-sentences"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arthurmluz/temario_data-wiki_cstnews_results | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 219281
num_examples: 25
download_size: 175736
dataset_size: 219281
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "temario_data-wikilingua_cstnews_results"
Results of the model arthurmluz/ptt5-wikilingua-cstnews on the dataset godoyj/temario.
'gen_summary' is the generated summary; both BERTScore and ROUGE metrics were calculated for each example.
Mean metrics:
rouge= {'rouge1': 0.3800757744192324, 'rouge2': 0.1539001654491066, 'rougeL': 0.2346540497659127, 'rougeLsum': 0.2346540497659127}
bert= {'precision': 0.7361391615867615, 'recall': 0.6891939973831177, 'f1': 0.711702299118042}
mover = 0.6075434818512242 |
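The means above are simple averages over the per-example scores stored in this dataset's `rouge` and `bert` columns. A minimal sketch of that aggregation (the rows below are made-up in-memory examples mirroring the schema, not loaded from the actual validation split):

```python
from statistics import mean

# Illustrative per-example rows mirroring the `rouge` / `bert` columns
# of this dataset (values are invented; the real split has 25 examples).
rows = [
    {"rouge": {"rouge1": 0.40, "rouge2": 0.16}, "bert": {"f1": [0.72]}},
    {"rouge": {"rouge1": 0.36, "rouge2": 0.15}, "bert": {"f1": [0.70]}},
]

mean_rouge1 = mean(r["rouge"]["rouge1"] for r in rows)
mean_bert_f1 = mean(r["bert"]["f1"][0] for r in rows)
print(mean_rouge1, mean_bert_f1)
```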
vwxyzjn/openhermes-dev__mistralai_Mixtral-8x7B-Instruct-v0.1__1706888126 | ---
dataset_info:
features:
- name: topic
dtype: string
- name: views
dtype: 'null'
- name: system_prompt
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: 'null'
- name: title
dtype: string
- name: model_name
dtype: string
- name: id
dtype: string
- name: avatarUrl
dtype: 'null'
- name: hash
dtype: 'null'
- name: custom_instruction
dtype: bool
- name: model
dtype: 'null'
- name: idx
dtype: 'null'
- name: source
dtype: string
- name: skip_prompt_formatting
dtype: bool
- name: category
dtype: string
- name: language
dtype: string
- name: prompt
dtype: string
- name: chosen_policy
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_policy
dtype: string
splits:
- name: train_prefs
num_bytes: 671350
num_examples: 80
- name: test_prefs
num_bytes: 9046
num_examples: 4
download_size: 432826
dataset_size: 680396
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
---
|
marcus2000/LEYA_multilang_dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: language
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 17789031
num_examples: 6798
- name: test
num_bytes: 7175566
num_examples: 1700
download_size: 14071233
dataset_size: 24964597
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
antiven0m/anthropos-dpo | ---
license: apache-2.0
---
|
Nerfgun3/guweiz_style | ---
language:
- en
tags:
- stable-diffusion
- text-to-image
license: creativeml-openrail-m
inference: false
---
# Guweiz Artist Embedding / Textual Inversion
## Usage
To use this embedding you have to download the file as well as drop it into the "\stable-diffusion-webui\embeddings" folder.
To use it in a prompt: ```"drawn by guweiz_style"```
If it is too strong, just add [] around it.
Trained for 9000 steps.
Have fun :)
## Example Pictures
<table>
<tr>
<td><img src=https://i.imgur.com/eCbB30e.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/U1Fezud.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/DqruJgs.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/O7VV7BS.png width=100% height=100%/></td>
<td><img src=https://i.imgur.com/k4sIsvH.png width=100% height=100%/></td>
</tr>
</table>
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
vietgpt/OSCAR-2301 | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: url
dtype: string
- name: date
dtype: string
- name: perplexity
dtype: float64
splits:
- name: train
num_bytes: 27907176803.480194
num_examples: 2918898
download_size: 10901340719
dataset_size: 27907176803.480194
---
# Dataset Card for "OSCAR-2301"
Num tokens: 4,478,799,252
davanstrien/blbooks-parquet-embedded | ---
annotations_creators:
- no-annotation
language_creators:
- machine-generated
language:
- de
- en
- es
- fr
- it
- nl
license:
- cc0-1.0
multilinguality:
- multilingual
size_categories:
- 100K<n<1M
source_datasets: davanstrien/blbooks-parquet
task_categories:
- text-generation
- fill-mask
- other
task_ids:
- language-modeling
- masked-language-modeling
pretty_name: British Library Books
tags:
- embeddings
dataset_info:
- config_name: all
features:
- name: record_id
dtype: string
- name: date
dtype: int32
- name: raw_date
dtype: string
- name: title
dtype: string
- name: place
dtype: string
- name: empty_pg
dtype: bool
- name: text
dtype: string
- name: pg
dtype: int32
- name: mean_wc_ocr
dtype: float32
- name: std_wc_ocr
dtype: float64
- name: name
dtype: string
- name: all_names
dtype: string
- name: Publisher
dtype: string
- name: Country of publication 1
dtype: string
- name: all Countries of publication
dtype: string
- name: Physical description
dtype: string
- name: Language_1
dtype: string
- name: Language_2
dtype: string
- name: Language_3
dtype: string
- name: Language_4
dtype: string
- name: multi_language
dtype: bool
splits:
- name: train
num_bytes: 30394267732
num_examples: 14011953
download_size: 10486035662
dataset_size: 30394267732
- config_name: 1800s
features:
- name: record_id
dtype: string
- name: date
dtype: int32
- name: raw_date
dtype: string
- name: title
dtype: string
- name: place
dtype: string
- name: empty_pg
dtype: bool
- name: text
dtype: string
- name: pg
dtype: int32
- name: mean_wc_ocr
dtype: float32
- name: std_wc_ocr
dtype: float64
- name: name
dtype: string
- name: all_names
dtype: string
- name: Publisher
dtype: string
- name: Country of publication 1
dtype: string
- name: all Countries of publication
dtype: string
- name: Physical description
dtype: string
- name: Language_1
dtype: string
- name: Language_2
dtype: string
- name: Language_3
dtype: string
- name: Language_4
dtype: string
- name: multi_language
dtype: bool
splits:
- name: train
num_bytes: 30020434670
num_examples: 13781747
download_size: 10348577602
dataset_size: 30020434670
- config_name: 1700s
features:
- name: record_id
dtype: string
- name: date
dtype: int32
- name: raw_date
dtype: string
- name: title
dtype: string
- name: place
dtype: string
- name: empty_pg
dtype: bool
- name: text
dtype: string
- name: pg
dtype: int32
- name: mean_wc_ocr
dtype: float32
- name: std_wc_ocr
dtype: float64
- name: name
dtype: string
- name: all_names
dtype: string
- name: Publisher
dtype: string
- name: Country of publication 1
dtype: string
- name: all Countries of publication
dtype: string
- name: Physical description
dtype: string
- name: Language_1
dtype: string
- name: Language_2
dtype: string
- name: Language_3
dtype: string
- name: Language_4
dtype: string
- name: multi_language
dtype: bool
splits:
- name: train
num_bytes: 266382657
num_examples: 178224
download_size: 95137895
dataset_size: 266382657
- config_name: '1510_1699'
features:
- name: record_id
dtype: string
- name: date
dtype: timestamp[s]
- name: raw_date
dtype: string
- name: title
dtype: string
- name: place
dtype: string
- name: empty_pg
dtype: bool
- name: text
dtype: string
- name: pg
dtype: int32
- name: mean_wc_ocr
dtype: float32
- name: std_wc_ocr
dtype: float64
- name: name
dtype: string
- name: all_names
dtype: string
- name: Publisher
dtype: string
- name: Country of publication 1
dtype: string
- name: all Countries of publication
dtype: string
- name: Physical description
dtype: string
- name: Language_1
dtype: string
- name: Language_2
dtype: string
- name: Language_3
dtype: string
- name: Language_4
dtype: string
- name: multi_language
dtype: bool
splits:
- name: train
num_bytes: 107667469
num_examples: 51982
download_size: 42320165
dataset_size: 107667469
- config_name: '1500_1899'
features:
- name: record_id
dtype: string
- name: date
dtype: timestamp[s]
- name: raw_date
dtype: string
- name: title
dtype: string
- name: place
dtype: string
- name: empty_pg
dtype: bool
- name: text
dtype: string
- name: pg
dtype: int32
- name: mean_wc_ocr
dtype: float32
- name: std_wc_ocr
dtype: float64
- name: name
dtype: string
- name: all_names
dtype: string
- name: Publisher
dtype: string
- name: Country of publication 1
dtype: string
- name: all Countries of publication
dtype: string
- name: Physical description
dtype: string
- name: Language_1
dtype: string
- name: Language_2
dtype: string
- name: Language_3
dtype: string
- name: Language_4
dtype: string
- name: multi_language
dtype: bool
splits:
- name: train
num_bytes: 30452067039
num_examples: 14011953
download_size: 10486035662
dataset_size: 30452067039
- config_name: '1800_1899'
features:
- name: record_id
dtype: string
- name: date
dtype: timestamp[s]
- name: raw_date
dtype: string
- name: title
dtype: string
- name: place
dtype: string
- name: empty_pg
dtype: bool
- name: text
dtype: string
- name: pg
dtype: int32
- name: mean_wc_ocr
dtype: float32
- name: std_wc_ocr
dtype: float64
- name: name
dtype: string
- name: all_names
dtype: string
- name: Publisher
dtype: string
- name: Country of publication 1
dtype: string
- name: all Countries of publication
dtype: string
- name: Physical description
dtype: string
- name: Language_1
dtype: string
- name: Language_2
dtype: string
- name: Language_3
dtype: string
- name: Language_4
dtype: string
- name: multi_language
dtype: bool
splits:
- name: train
num_bytes: 30077284377
num_examples: 13781747
download_size: 10348577602
dataset_size: 30077284377
- config_name: '1700_1799'
features:
- name: record_id
dtype: string
- name: date
dtype: timestamp[s]
- name: raw_date
dtype: string
- name: title
dtype: string
- name: place
dtype: string
- name: empty_pg
dtype: bool
- name: text
dtype: string
- name: pg
dtype: int32
- name: mean_wc_ocr
dtype: float32
- name: std_wc_ocr
dtype: float64
- name: name
dtype: string
- name: all_names
dtype: string
- name: Publisher
dtype: string
- name: Country of publication 1
dtype: string
- name: all Countries of publication
dtype: string
- name: Physical description
dtype: string
- name: Language_1
dtype: string
- name: Language_2
dtype: string
- name: Language_3
dtype: string
- name: Language_4
dtype: string
- name: multi_language
dtype: bool
splits:
- name: train
num_bytes: 267117831
num_examples: 178224
download_size: 95137895
dataset_size: 267117831
---
# Dataset Card for "blbooks-parquet-embedded"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AmelieSchreiber/12M_binding_sites | ---
license: mit
---
|
pranjali97/french_translated_snli | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: translated_premise
dtype: string
- name: translated_hypothesis
dtype: string
splits:
- name: validation
num_bytes: 2303892
num_examples: 10000
- name: train
num_bytes: 122642216
num_examples: 550152
- name: test
num_bytes: 2296826
num_examples: 10000
download_size: 40424758
dataset_size: 127242934
language:
- en
- fr
---
# Dataset Card for "french_translated_snli"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jordanparker6/publaynet | ---
title: PubLayNet
license: other
annotations_creators: []
language:
- en
size_categories:
- 100B<n<1T
source_datasets: []
task_categories:
- image-to-text
task_ids: []
---
# PubLayNet
PubLayNet is a large dataset of document images, of which the layout is annotated with both bounding boxes and polygonal segmentations. The source of the documents is [PubMed Central Open Access Subset (commercial use collection)](https://www.ncbi.nlm.nih.gov/pmc/tools/openftlist/). The annotations are automatically generated by matching the PDF format and the XML format of the articles in the PubMed Central Open Access Subset. More details are available in our paper ["PubLayNet: largest dataset ever for document layout analysis."](https://arxiv.org/abs/1908.07836).
The public dataset is distributed in tar.gz format, which doesn't fit nicely with Hugging Face streaming. Modifications have been made to optimise delivery of the dataset for the Hugging Face datasets API. The original files can be found [here](https://developer.ibm.com/exchanges/data/all/publaynet/).
Licence: [Community Data License Agreement – Permissive – Version 1.0 License](https://cdla.dev/permissive-1-0/)
Author: IBM
GitHub: https://github.com/ibm-aur-nlp/PubLayNet
@article{zhong2019publaynet,
  title = {PubLayNet: largest dataset ever for document layout analysis},
  author = {Zhong, Xu and Tang, Jianbin and Yepes, Antonio Jimeno},
  journal = {arXiv preprint arXiv:1908.07836},
  year = {2019}
} |
kaesarsg/misdatos_mitre_p1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 651104
num_examples: 736
download_size: 286921
dataset_size: 651104
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hlillemark/flores200_devtest_mt5-600m-flores200-packed | ---
dataset_info:
features:
- name: id
dtype: int32
- name: source_lang
dtype: string
- name: target_lang
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: prediction
dtype: string
- name: chrf_unreduced
dtype: string
splits:
- name: devtest
num_bytes: 743880583
num_examples: 1000000
download_size: 520688518
dataset_size: 743880583
---
# Dataset Card for "flores200_devtest_mt5-600m-flores200-packed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HamdanXI/lj_speech_DifferentStructure_removedVocabs | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 22050
- name: file
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1347808036.0
num_examples: 4620
- name: test
num_bytes: 487719584.0
num_examples: 1680
download_size: 1828316030
dataset_size: 1835527620.0
---
# Dataset Card for "lj_speech_DifferentStructure_removedVocabs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/Caltech101_not_background_test_facebook_opt_350m_Attributes_Caption_ns_5647 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_16
num_bytes: 84345940.125
num_examples: 5647
- name: fewshot_1_bs_16
num_bytes: 85792356.125
num_examples: 5647
- name: fewshot_3_bs_16
num_bytes: 88692846.125
num_examples: 5647
- name: fewshot_5_bs_16
num_bytes: 91584840.125
num_examples: 5647
- name: fewshot_8_bs_16
num_bytes: 95914371.125
num_examples: 5647
download_size: 416501462
dataset_size: 446330353.625
---
# Dataset Card for "Caltech101_not_background_test_facebook_opt_350m_Attributes_Caption_ns_5647"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
samuelsze/bev_da_d_pedx_walkway_carpark_blackbackground | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 155151709.971
num_examples: 40157
download_size: 102677506
dataset_size: 155151709.971
---
# Dataset Card for "bev_da_d_pedx_walkway_carpark_blackbackground"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hac541309/the_stack_smoll_all_merged_ws | ---
license: other
---
|
intone/horror_stories_reddit | ---
task_categories:
- text-generation
- translation
language:
- en
size_categories:
- 1K<n<10K
---
# HSR <br>
HSR is a compilation of 5605 reddit posts scraped from the following subreddits:
- r/ScaryStories
- r/LetsNotMeet
- r/TwoSentenceHorror
- r/freehorrorstories
- r/TrueScaryStories
- r/NoSleep
- r/Ruleshorror
# HSR Credits
If you use HSR, you must cite us in your project. This dataset can be used for translation, generative, or conversational models. <br>
Here are a few ideas that you can use HSR for: <br>
- Title-to-story
- Text Generation
- Spooky chats
|
alvarodt/falcon | ---
dataset_info:
features:
- name: timestamp
dtype: string
- name: origin_id
dtype: string
- name: origin_name
dtype: string
- name: origin_code
dtype: string
- name: destination_id
dtype: string
- name: destination_name
dtype: string
- name: destination_code
dtype: string
- name: clicks
dtype: int64
- name: plane_id
dtype: string
- name: plane_name
dtype: string
- name: use
dtype: string
splits:
- name: data
num_bytes: 512665
num_examples: 3319
download_size: 164139
dataset_size: 512665
---
# Dataset Card for "falcon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yuan-sf63/chenyu_label_0.8_72 | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '31'
dtype: int64
- name: '32'
dtype: int64
- name: '33'
dtype: int64
- name: '34'
dtype: int64
- name: '35'
dtype: int64
- name: '36'
dtype: int64
- name: '37'
dtype: int64
- name: '38'
dtype: int64
- name: '39'
dtype: int64
- name: '40'
dtype: int64
- name: '41'
dtype: int64
- name: '42'
dtype: int64
- name: '43'
dtype: int64
- name: '44'
dtype: int64
- name: '45'
dtype: int64
- name: '46'
dtype: int64
- name: '47'
dtype: int64
- name: '48'
dtype: int64
- name: '49'
dtype: int64
- name: '50'
dtype: int64
- name: '51'
dtype: int64
- name: '52'
dtype: int64
- name: '53'
dtype: int64
- name: '54'
dtype: int64
- name: '55'
dtype: int64
- name: '56'
dtype: int64
- name: '57'
dtype: int64
- name: '58'
dtype: int64
- name: '59'
dtype: int64
- name: '60'
dtype: int64
- name: '61'
dtype: int64
- name: '62'
dtype: int64
- name: '63'
dtype: int64
- name: '64'
dtype: int64
- name: '65'
dtype: int64
- name: '66'
dtype: int64
- name: '67'
dtype: int64
- name: '68'
dtype: int64
- name: '69'
dtype: int64
- name: '70'
dtype: int64
- name: '71'
dtype: int64
splits:
- name: train
num_bytes: 26831739.356959388
num_examples: 38893
- name: validation
num_bytes: 2981687.643040611
num_examples: 4322
download_size: 0
dataset_size: 29813427.0
---
# Dataset Card for "chenyu_label_0.8_72"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Adminhuggingface/AppScan_llama2_7b_fine_tuning_complete_dataset_v1.csv | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 52184
num_examples: 80
- name: test
num_bytes: 13207
num_examples: 20
download_size: 16303
dataset_size: 65391
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-eval-futin__random-en-805a17-2021966770 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/random
eval_info:
task: text_zero_shot_classification
model: facebook/opt-1.3b
metrics: []
dataset_name: futin/random
dataset_config: en
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-1.3b
* Dataset: futin/random
* Config: en
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
Multimodal-Fatima/FGVC_Aircraft_test_facebook_opt_1.3b_Visclues_ns_3333_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 300686978.375
num_examples: 3333
- name: fewshot_3_bs_16
num_bytes: 302944359.375
num_examples: 3333
download_size: 595746699
dataset_size: 603631337.75
---
# Dataset Card for "FGVC_Aircraft_test_facebook_opt_1.3b_Visclues_ns_3333_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nikniksen/shawgpt-youtube-comments | ---
license: mit
size_categories:
- n<1K
dataset_info:
features:
- name: example
dtype: string
splits:
- name: train
num_bytes: 42370
num_examples: 50
- name: test
num_bytes: 8353
num_examples: 9
download_size: 26098
dataset_size: 50723
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
Dataset for ShawGPT, a fine-tuned data science YouTube comment responder.
Video link: *coming soon!* <br>
Blog link: *coming soon!* |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/64b0981a | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1341
dataset_size: 178
---
# Dataset Card for "64b0981a"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
evelinamorim/buscape-reviews | ---
license: unknown
language:
- pt
---
The Buscapé corpus is a large corpus of Portuguese product reviews crawled in 2013, with more than 80,000 samples from Buscapé, a product and price search website. Unlike the datasets above, the labels range over the 0 to 5 interval.
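As a minimal sketch of the zero-rating filter (the field names `review` and `rating` are assumptions for illustration, not the corpus's actual schema):

```python
# Hypothetical review records; the field names ("review", "rating") are
# illustrative assumptions, not the corpus's actual column names.
reviews = [
    {"review": "Produto excelente!", "rating": 5},
    {"review": "Não recomendo.", "rating": 0},
    {"review": "Razoável.", "rating": 3},
]

# Keep only comments whose rating is in the 1-5 range.
filtered = [r for r in reviews if r["rating"] > 0]
print(len(filtered))
```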
Comments with a rating of zero were therefore removed. |
CyberHarem/evelyn_neuralcloud | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of evelyn/イヴリン/伊芙琳 (Neural Cloud)
This is the dataset of evelyn/イヴリン/伊芙琳 (Neural Cloud), containing 19 images and their tags.
The core tags of this character are `long_hair, red_eyes, eyepatch, bangs, breasts, white_hair, large_breasts, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 38.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelyn_neuralcloud/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 19.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelyn_neuralcloud/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 49 | 39.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelyn_neuralcloud/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 33.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelyn_neuralcloud/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 49 | 58.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/evelyn_neuralcloud/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/evelyn_neuralcloud',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, nipples, solo, cum_on_breasts, nude, blush, closed_mouth, navel, pussy |
| 1 | 7 |  |  |  |  |  | 1girl, solo, closed_mouth, holding_gun, pantyhose, red_gloves, tactical_clothes, black_footwear, boots, cape, bulletproof_vest, handgun, long_sleeves, looking_at_viewer, outdoors, pouch, rifle, shotgun_shell, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | nipples | solo | cum_on_breasts | nude | blush | closed_mouth | navel | pussy | holding_gun | pantyhose | red_gloves | tactical_clothes | black_footwear | boots | cape | bulletproof_vest | handgun | long_sleeves | outdoors | pouch | rifle | shotgun_shell | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:----------|:-------|:-----------------|:-------|:--------|:---------------|:--------|:--------|:--------------|:------------|:-------------|:-------------------|:-----------------|:--------|:-------|:-------------------|:----------|:---------------|:-----------|:--------|:--------|:----------------|:-----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
heliosprime/twitter_dataset_1713203083 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 27444
num_examples: 75
download_size: 23340
dataset_size: 27444
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713203083"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nguyenthanhdo/goodboi | ---
dataset_info:
features:
- name: source
dtype: string
- name: from
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3778115.48
num_examples: 1000
download_size: 1745688
dataset_size: 3778115.48
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
johannes-garstenauer/embeddings_from_distilbert_masking_heaps_and_eval_part0_test | ---
dataset_info:
features:
- name: struct
dtype: string
- name: label
dtype: int64
- name: pred
dtype: int64
- name: cls_layer_6
sequence: float32
- name: cls_layer_5
sequence: float32
- name: cls_layer_4
sequence: float32
splits:
- name: train
num_bytes: 13428556
num_examples: 1408
download_size: 16660183
dataset_size: 13428556
---
# Dataset Card for "embeddings_from_distilbert_masking_heaps_and_eval_part0_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lamini/taylor_swift | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 850749.3
num_examples: 783
- name: test
num_bytes: 94527.7
num_examples: 87
download_size: 303257
dataset_size: 945277.0
---
# Dataset Card for "taylor_swift"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
allenai/multinews_dense_oracle | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- other
multilinguality:
- monolingual
pretty_name: Multi-News
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- summarization
task_ids:
- news-articles-summarization
paperswithcode_id: multi-news
train-eval-index:
- config: default
task: summarization
task_id: summarization
splits:
train_split: train
eval_split: test
col_mapping:
document: text
summary: target
metrics:
- type: rouge
name: Rouge
---
This is a copy of the [Multi-News](https://huggingface.co/datasets/multi_news) dataset, except the input source documents of the `train`, `validation`, and `test` splits have been replaced by documents retrieved with a __dense__ retriever. The retrieval pipeline used:
- __query__: The `summary` field of each example
- __corpus__: The union of all documents in the `train`, `validation` and `test` splits
- __retriever__: [`facebook/contriever-msmarco`](https://huggingface.co/facebook/contriever-msmarco) via [PyTerrier](https://pyterrier.readthedocs.io/en/latest/) with default settings
- __top-k strategy__: `"oracle"`, i.e. the number of documents retrieved, `k`, is set as the original number of input documents for each example
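Since `k` is set to the number of gold documents per example, Precision@k, Recall@k, and R-precision collapse into the same quantity, which is why those three columns match in the tables below. A minimal sketch of that metric (not the actual PyTerrier evaluation code):

```python
def oracle_precision_at_k(retrieved, gold):
    """Precision@k under the 'oracle' strategy, where k = len(gold).

    With this choice of k, Precision@k, Recall@k, and R-precision are
    all hits / k, so the three metrics coincide.
    """
    k = len(gold)  # oracle: retrieve as many docs as the example originally had
    hits = sum(1 for doc in retrieved[:k] if doc in set(gold))
    return hits / k

# Toy example: 3 gold source documents, ranked retrieval results.
print(oracle_precision_at_k(
    ["doc_a", "doc_x", "doc_b", "doc_c", "doc_y"],  # ranking
    ["doc_a", "doc_b", "doc_c"],                    # gold
))  # 2 of the top-3 are gold -> 0.666...
```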
Retrieval results on the `train` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.8661 | 0.6867 | 0.6867 | 0.6867 |
Retrieval results on the `validation` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.8626 | 0.6859 | 0.6859 | 0.6859 |
Retrieval results on the `test` set:
| Recall@100 | Rprec | Precision@k | Recall@k |
| ----------- | ----------- | ----------- | ----------- |
| 0.8625 | 0.6927 | 0.6927 | 0.6927 | |
sam-mosaic/vicuna_alpaca_hc3_chatml | ---
language: en
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 387859366
num_examples: 170637
download_size: 146603814
dataset_size: 387859366
---
# Dataset Card for "vicuna_alpaca_hc3_chatml"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rcds/swiss_citation_extraction | ---
license: cc-by-sa-4.0
task_categories:
- token-classification
language:
- de
- fr
- it
pretty_name: Swiss Citation Extraction
size_categories:
- 100K<n<1M
---
# Dataset Card for Swiss Citation Extraction
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Swiss Citation Extraction is a multilingual, diachronic dataset of 131K Swiss Federal Supreme Court (FSCS) cases. It forms the basis of a challenging token classification task.
### Supported Tasks and Leaderboards
### Languages
Switzerland has four official languages; three of them (German, French, and Italian) are represented in this dataset. The decisions are written by the judges and clerks in the language of the proceedings.
| Language | Subset | Number of Documents |
|------------|------------|----------------------|
| German | **de** | 85K |
| French | **fr** | 38K |
| Italian | **it** | 8K |
## Dataset Structure
### Data Fields
```
decision_id:
considerations:
NER_labels: CITATION refers to a case citation or a reference to another court decision. LAW indicates a reference to a specific law. O is used for words or tokens that don't fall under the previous two labels. In accordance with the IOB format, each tag, apart from 'O', is accompanied by the 'B-' prefix if it marks the beginning of the span, or the 'I-' prefix if it's inside or at the end of the span.
law_area: (string)
language: (string)
year: (int64)
chamber: (string)
region: (string)
```
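To make the IOB scheme concrete, here is a small illustrative sketch; the tokens and tags are invented for the example, not drawn from the corpus:

```python
# Hypothetical tokenized snippet from a court decision, with IOB tags:
# a citation span ("BGE 121 III 219") and a law reference ("Art. 2 ZGB").
tokens = ["vgl.", "BGE", "121", "III", "219", "sowie", "Art.", "2", "ZGB", "."]
labels = ["O", "B-CITATION", "I-CITATION", "I-CITATION", "I-CITATION",
          "O", "B-LAW", "I-LAW", "I-LAW", "O"]

def extract_spans(tokens, labels):
    """Group B-/I- runs back into (entity_type, text) spans."""
    spans, current = [], None
    for tok, lab in zip(tokens, labels):
        if lab.startswith("B-"):
            if current:
                spans.append(current)
            current = (lab[2:], [tok])
        elif lab.startswith("I-") and current and current[0] == lab[2:]:
            current[1].append(tok)
        else:
            if current:
                spans.append(current)
            current = None
    if current:
        spans.append(current)
    return [(t, " ".join(ws)) for t, ws in spans]

print(extract_spans(tokens, labels))
# [('CITATION', 'BGE 121 III 219'), ('LAW', 'Art. 2 ZGB')]
```

Reconstructing spans from the tags this way is a common sanity check when preparing token classification data.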
### Data Instances
[More Information Needed]
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
The original data are published from the Swiss Federal Supreme Court (https://www.bger.ch) in unprocessed formats (HTML). The documents were downloaded from the Entscheidsuche portal (https://entscheidsuche.ch) in HTML.
#### Who are the source language producers?
The decisions are written by the judges and clerks in the language of the proceedings.
### Annotations
#### Annotation process
#### Who are the annotators?
Metadata is published by the Swiss Federal Supreme Court (https://www.bger.ch).
### Personal and Sensitive Information
The dataset contains publicly available court decisions from the Swiss Federal Supreme Court. Personal or sensitive information has been anonymized by the court before publication according to the following guidelines: https://www.bger.ch/home/juridiction/anonymisierungsregeln.html.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
We release the data under CC-BY-4.0 which complies with the court licensing (https://www.bger.ch/files/live/sites/bger/files/pdf/de/urteilsveroeffentlichung_d.pdf)
© Swiss Federal Supreme Court, 2002-2022
The copyright for the editorial content of this website and the consolidated texts, which is owned by the Swiss Federal Supreme Court, is licensed under the Creative Commons Attribution 4.0 International licence. This means that you can re-use the content provided you acknowledge the source and indicate any changes you have made.
Source: https://www.bger.ch/files/live/sites/bger/files/pdf/de/urteilsveroeffentlichung_d.pdf
### Citation Information
Please cite our [ArXiv-Preprint](https://arxiv.org/abs/2306.09237)
```
@misc{rasiah2023scale,
title={SCALE: Scaling up the Complexity for Advanced Language Model Evaluation},
author={Vishvaksenan Rasiah and Ronja Stern and Veton Matoshi and Matthias Stürmer and Ilias Chalkidis and Daniel E. Ho and Joel Niklaus},
year={2023},
eprint={2306.09237},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions |
Malvinan/mt5_shuffled_language_modeling | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: language
dtype: string
- name: image_list
sequence: string
- name: annotations
sequence: string
- name: input_token_ids
sequence:
sequence: int64
- name: output_token_ids
sequence:
sequence: int64
splits:
- name: train
num_bytes: 20543789248
num_examples: 1634884
- name: validation
num_bytes: 117470622
num_examples: 9166
download_size: 3530874073
dataset_size: 20661259870
---
# Dataset Card for "mt5_shuffled_language_modeling"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
guitdenis/penal | ---
license: unknown
---
|
Kutches/Danganronpa | ---
license: openrail
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-50000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 662539
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autumnjohnson/ceti_audio | ---
size_categories:
- 1K<n<10K
pretty_name: Project CETI (Cetacean Translation Initiative) audio
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: coda_type
dtype: string
- name: path
dtype: string
- name: sampling_rate
dtype: int64
splits:
- name: train
num_bytes: 295401207.9840547
num_examples: 3160
- name: test
num_bytes: 32905451.01594533
num_examples: 352
download_size: 162207534
dataset_size: 328306659.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "ceti_audio"
## Table of Contents
- [Dataset Card for "ceti\_audio"](#dataset-card-for-ceti_audio)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@autumnjohnson](https://github.com/autumnjohnson) for adding this dataset. |
CyberHarem/enterprise_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of enterprise/エンタープライズ/企业 (Azur Lane)
This is the dataset of enterprise/エンタープライズ/企业 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `long_hair, purple_eyes, white_hair, breasts, hat, large_breasts, white_headwear, peaked_cap, very_long_hair, military_hat, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 774.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/enterprise_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 397.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/enterprise_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1267 | 854.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/enterprise_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 665.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/enterprise_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1267 | 1.23 GiB | [Download](https://huggingface.co/datasets/CyberHarem/enterprise_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/enterprise_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, black_coat, black_necktie, looking_at_viewer, miniskirt, open_coat, pleated_skirt, simple_background, sleeveless_shirt, solo, underbust, white_background, black_belt, bare_shoulders, collared_shirt, black_skirt, black_thighhighs, smile, white_shirt |
| 1 | 9 |  |  |  |  |  | 1girl, black_belt, black_coat, black_necktie, black_skirt, blue_sky, miniskirt, open_coat, pleated_skirt, sleeveless_shirt, solo, looking_at_viewer, cloud, underbust, cowboy_shot, day, black_thighhighs, collared_shirt, eagle |
| 2 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_belt, black_coat, black_necktie, miniskirt, open_coat, pleated_skirt, sleeveless_shirt, solo, collared_shirt, thighhighs, underbust |
| 3 | 52 |  |  |  |  |  | 1girl, black_necktie, sleeveless_shirt, solo, collared_shirt, white_shirt, upper_body, bare_shoulders, looking_at_viewer, black_coat, open_coat, hair_between_eyes, off_shoulder, simple_background, white_background, smile |
| 4 | 11 |  |  |  |  |  | 1girl, white_shirt, baseball_cap, looking_at_viewer, solo, sunglasses, jacket_around_waist, necklace, smile, bare_shoulders, black_headwear, eyewear_on_headwear, off-shoulder_shirt, clothes_writing, collarbone, wristband, short_shorts, black_shorts, bra_strap, closed_mouth, sidelocks, sitting, wristwatch |
| 5 | 6 |  |  |  |  |  | 1girl, black_pantyhose, boots, christmas, looking_at_viewer, smile, solo, white_footwear, winter_clothes, black_belt, earmuffs, coat, fur_trim, white_gloves, blush, pom_pom_(clothes), reindeer, sitting, skirt, white_scarf |
| 6 | 7 |  |  |  |  |  | 1girl, earmuffs, smile, solo, white_gloves, white_scarf, winter_clothes, christmas, looking_at_viewer, black_belt, fur_trim, official_alternate_costume, white_coat, blush, cape, open_mouth, sweater |
| 7 | 10 |  |  |  |  |  | 1girl, coat, gift_box, solo, winter_clothes, christmas, holding_gift, looking_at_viewer, white_gloves, blush, earmuffs, smile, white_scarf, black_belt, open_mouth, sweater |
| 8 | 14 |  |  |  |  |  | 1girl, official_alternate_costume, race_queen, solo, looking_at_viewer, ponytail, sunglasses, bare_shoulders, cleavage, thighhighs, blush, eyewear_on_headwear, jacket, off_shoulder, thighs, collarbone, halter_dress, tinted_eyewear, boots, hair_through_headwear, white_belt, holding, outdoors, panties, partially_fingerless_gloves, single_fingerless_glove, sitting |
| 9 | 15 |  |  |  |  |  | 1girl, china_dress, hair_flower, solo, white_dress, coat, fur_trim, looking_at_viewer, smile, gold_trim, hoop_earrings, blush, hair_between_eyes, official_alternate_costume, pelvic_curtain, side_slit, sidelocks, simple_background, white_background |
| 10 | 11 |  |  |  |  |  | 1girl, bridal_veil, looking_at_viewer, solo, wedding_dress, white_dress, cleavage, smile, blush, necklace, bride, bare_shoulders, earrings, holding_bouquet, white_rose, closed_mouth, hair_between_eyes, sidelocks, collarbone, mini_crown, sky, tiara |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_coat | black_necktie | looking_at_viewer | miniskirt | open_coat | pleated_skirt | simple_background | sleeveless_shirt | solo | underbust | white_background | black_belt | bare_shoulders | collared_shirt | black_skirt | black_thighhighs | smile | white_shirt | blue_sky | cloud | cowboy_shot | day | eagle | thighhighs | upper_body | hair_between_eyes | off_shoulder | baseball_cap | sunglasses | jacket_around_waist | necklace | black_headwear | eyewear_on_headwear | off-shoulder_shirt | clothes_writing | collarbone | wristband | short_shorts | black_shorts | bra_strap | closed_mouth | sidelocks | sitting | wristwatch | black_pantyhose | boots | christmas | white_footwear | winter_clothes | earmuffs | coat | fur_trim | white_gloves | blush | pom_pom_(clothes) | reindeer | skirt | white_scarf | official_alternate_costume | white_coat | cape | open_mouth | sweater | gift_box | holding_gift | race_queen | ponytail | cleavage | jacket | thighs | halter_dress | tinted_eyewear | hair_through_headwear | white_belt | holding | outdoors | panties | partially_fingerless_gloves | single_fingerless_glove | china_dress | hair_flower | white_dress | gold_trim | hoop_earrings | pelvic_curtain | side_slit | bridal_veil | wedding_dress | bride | earrings | holding_bouquet | white_rose | mini_crown | sky | tiara |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------------|:----------------|:--------------------|:------------|:------------|:----------------|:--------------------|:-------------------|:-------|:------------|:-------------------|:-------------|:-----------------|:-----------------|:--------------|:-------------------|:--------|:--------------|:-----------|:--------|:--------------|:------|:--------|:-------------|:-------------|:--------------------|:---------------|:---------------|:-------------|:----------------------|:-----------|:-----------------|:----------------------|:---------------------|:------------------|:-------------|:------------|:---------------|:---------------|:------------|:---------------|:------------|:----------|:-------------|:------------------|:--------|:------------|:-----------------|:-----------------|:-----------|:-------|:-----------|:---------------|:--------|:--------------------|:-----------|:--------|:--------------|:-----------------------------|:-------------|:-------|:-------------|:----------|:-----------|:---------------|:-------------|:-----------|:-----------|:---------|:---------|:---------------|:-----------------|:------------------------|:-------------|:----------|:-----------|:----------|:------------------------------|:--------------------------|:--------------|:--------------|:--------------|:------------|:----------------|:-----------------|:------------|:--------------|:----------------|:--------|:-----------|:------------------|:-------------|:-------------|:------|:--------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | | X | | X | X | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | | X | X | X | | X | X | X | | X | X | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 52 |  |  |  |  |  | X | X | X | X | | X | | X | X | X | | X | | X | X | | | X | X | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 11 |  |  |  |  |  | X | | | X | | | | | | X | | | | X | | | | X | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | X | | | | | | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | | | X | | | | | | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | | X | X | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 10 |  |  |  |  |  | X | | | X | | | | | | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | | X | X | | | | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 14 |  |  |  |  |  | X | | | X | | | | | | X | | | | X | | | | | | | | | | | X | | | X | | X | | | | X | | | X | | | | | | | X | | | X | | | | | | | | X | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 9 | 15 |  |  |  |  |  | X | | | X | | | | X | | X | | X | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | X | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | |
| 10 | 11 |  |  |  |  |  | X | | | X | | | | | | X | | | | X | | | | X | | | | | | | | | X | | | | | X | | | | | X | | | | | X | X | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X |
|
ktrinh38/ldm-fashion-unsegmented | ---
dataset_info:
features:
- name: segmentation
dtype: image
- name: image
dtype: image
splits:
- name: train
num_bytes: 14116953824.608
num_examples: 5116
download_size: 4690604609
dataset_size: 14116953824.608
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
maidalun1020/CrosslingualRetrievalOthersZh2En-qrels | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
dataset_info:
features:
- name: qid
dtype: string
- name: pid
dtype: string
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 557424
num_examples: 23003
download_size: 295231
dataset_size: 557424
---
|
empbetty/dogSamples | ---
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 784103393.0
num_examples: 25064
download_size: 782573601
dataset_size: 784103393.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dogSamples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
laurent255/octave | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 2196528.0
num_examples: 268
- name: test
num_bytes: 245880.0
num_examples: 30
download_size: 1126626
dataset_size: 2442408.0
---
# Dataset Card for "octave"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Noor0/AFRD_Arabic-Fake-Reviews-Detection | ---
license: cc-by-4.0
---
# AFRD: Arabic Fake Reviews Detection dataset
- [Description](#description)
- [Citation](#citation)
## Description
Arabic Fake Reviews Detection (AFRD) is the first gold-standard dataset comprising three domains, namely, the hotel, restaurant, and product domains. Each domain has a set of attributes: the reviewer’s age, the reviewer’s gender, the service name, the review’s text, the rating, the text’s polarity, and the review’s class. The overall balanced dataset consists of 1728 reviews: 310 reviews for the hotel domain, 714 reviews for the restaurant domain, and 704 reviews for the product domain; the two classes in each domain are balanced. An unbalanced version with 1958 reviews is also available. The following table shows the number of reviews in each class of the balanced dataset:
| Domain | Fake class | Truthful class | Total |
|--------------|------------|----------------|---------|
| Hotel | 155 | 155 | 310 |
| Restaurant | 357 | 357 | 714 |
| Product | 352 | 352 | 704 |
| Multi-domain | 864 | 864 | 1728 |
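These counts can be sanity-checked in a few lines of plain Python; the dictionary below simply transcribes the table above, and the helper itself is only illustrative:

```python
# Per-domain (fake, truthful) counts transcribed from the table above.
counts = {"hotel": (155, 155), "restaurant": (357, 357), "product": (352, 352)}

# Each domain total is fake + truthful, and the domain totals sum to the
# multi-domain total of 1728 balanced reviews.
totals = {domain: fake + truthful for domain, (fake, truthful) in counts.items()}
print(totals)                # {'hotel': 310, 'restaurant': 714, 'product': 704}
print(sum(totals.values()))  # 1728
```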
Moreover, the review sentiment is balanced within each class. The following figure shows how the negative and positive reviews are balanced:

For more information, refer to the paper:
[Multiscale cascaded domain-based approach for Arabic fake reviews detection in e-commerce platforms](https://www.sciencedirect.com/science/article/pii/S1319157824000156#sec4)
## Citation
Please cite the following paper if you used the dataset:
Qandos, N., Hamad, G., Alharbi, M., Alturki, S., Alharbi, W., & Albelaihi, A. A. (2024). Multiscale cascaded domain-based approach for Arabic fake reviews detection in e-commerce platforms. Journal of King Saud University-Computer and Information Sciences, 101926.
|
deepmind/code_contests | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- translation
task_ids: []
paperswithcode_id: codecontests
pretty_name: CodeContests
---
# Dataset Card for CodeContests
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Repository:** https://github.com/deepmind/code_contests/
- **Paper:** [Competition-Level Code Generation with AlphaCode](https://arxiv.org/abs/2203.07814v1)
- **Leaderboard:** [Code Generation on CodeContests](https://paperswithcode.com/sota/code-generation-on-codecontests)
- **Point of Contact:** [David Choi](mailto:david.hu.choi@gmail.com)
### Dataset Summary
CodeContests is a competitive programming dataset for machine learning. This
dataset was used when training [AlphaCode](https://deepmind.com/blog/article/Competitive-programming-with-AlphaCode).
It consists of programming problems, from a variety of sources:
Site | URL | Source
----------- | --------------------------- | ------
Aizu | https://judge.u-aizu.ac.jp | [CodeNet](https://github.com/IBM/Project_CodeNet)
AtCoder | https://atcoder.jp | [CodeNet](https://github.com/IBM/Project_CodeNet)
CodeChef | https://www.codechef.com | [description2code](https://github.com/ethancaballero/description2code)
Codeforces | https://codeforces.com | [description2code](https://github.com/ethancaballero/description2code) and Codeforces
HackerEarth | https://www.hackerearth.com | [description2code](https://github.com/ethancaballero/description2code)
Problems include test cases in the form of paired inputs and outputs, as well as both correct and incorrect human solutions in a variety of languages.
### Supported Tasks and Leaderboards
- `translation` - the competitive programming code generation problem can be viewed as a sequence-to-sequence translation task: given a problem description 𝑋 in natural language, produce a corresponding solution 𝑌 in a programming language. The metric used for evaluation is "percentage of problems solved using 𝑛 submissions from 𝑘 samples per problem", denoted as 𝑛@𝑘. More information on the evaluation of AlphaCode can be found in Section 2.2. and Appendix A.3. of the paper. The leaderboard for this task is available [here](https://paperswithcode.com/sota/code-generation-on-codecontests).
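To make the 𝑛@𝑘 metric concrete, here is a minimal sketch: for each problem, draw 𝑛 submissions from its 𝑘 generated samples and count the problem as solved if any drawn submission passes all tests. The boolean pass matrix and the random-draw selection are simplifying assumptions for illustration only; the paper's actual evaluation (Section 2.2 and Appendix A.3) selects submissions with filtering and clustering rather than at random.

```python
import random

def solve_rate_n_at_k(pass_matrix, n, seed=0):
    """Estimate n@k: a problem counts as solved if any of the n submissions,
    drawn at random from its k samples, passes all tests.

    pass_matrix: one length-k list of booleans per problem, where True means
    that sample passes every test for that problem.
    """
    rng = random.Random(seed)
    solved = 0
    for samples in pass_matrix:
        submissions = rng.sample(samples, n)  # pick n of the k samples
        if any(submissions):
            solved += 1
    return solved / len(pass_matrix)

# Two problems with k=4 samples each; only the first has a passing sample.
matrix = [[False, True, False, False], [False, False, False, False]]
rate = solve_rate_n_at_k(matrix, n=4)  # n == k, so selection is exhaustive
print(rate)  # 0.5
```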
### Languages
English.
## Dataset Structure
### Data Instances
A data point corresponds to a singular contest problem:
```
{
'name': '76_B. Mice',
'description': 'Modern researches has shown that a flock of hungry mice '
'searching for a piece of...',
'public_tests': {'input': ['3 2 0 2\n0 1 3\n2 5\n'], 'output': ['1\n']},
'private_tests': {'input': ['20 18 1 2\n'
'-9999944 -9999861 -9999850 -9999763 -9999656 '
'-9999517 -9999375 -999927...',
...,
'7 11 10 20\n'
'6 18 32 63 66 68 87\n'
'6 8 15 23 25 41 53 59 60 75 90\n'],
'output': ['2\n', ..., '1\n']},
'generated_tests': {'input': ['7 11 10 5\n'
'6 18 32 63 66 68 87\n'
'6 8 15 23 25 41 53 59 60 75 90\n',
...,
'7 11 10 4\n'
'6 18 46 63 85 84 87\n'
'6 8 15 18 25 41 53 59 60 75 90\n'],
'output': ['1\n', ..., '2\n']},
'source': 2,
'difficulty': 8,
'solutions': {'language': [2, ..., 2],
'solution': ['#include <bits/stdc++.h>\n'
'using namespace std;\n'
'int n, m;\n'
'int data[2][100010], t[1...',
...,
'#include <bits/stdc++.h>\n'
'using namespace std;\n'
'int n, m, pos[100100], food[100100...']},
'incorrect_solutions': {'language': [2, ..., 2],
'solution': ['#include <bits/stdc++.h>\n'
'using namespace std;\n'
'vector<pair<int, int> > v[100010];...',
...,
'#include <bits/stdc++.h>\n'
'using namespace std;\n'
'vector<pair<int, int> > v[100010];...']},
'cf_contest_id': 76,
'cf_index': 'B',
'cf_points': 0.0,
'cf_rating': 2100,
'cf_tags': ['greedy', 'two pointers'],
'is_description_translated': False,
'untranslated_description': '',
'time_limit': {'seconds': 0, 'nanos': 500000000},
'memory_limit_bytes': 256000000,
'input_file': '',
'output_file': ''
}
```
### Data Fields
- `name`: The name of the contest. Note that names may coincide across different sources.
- `description`: A natural language description of a programming problem.
- `public_tests`: Public tests are those that are available before submitting a solution, typically as part of the description itself. Represented as a paired `input` and `output` that can be used to test potential solutions. They are therefore acceptable inputs to a model.
- `private_tests`: Private tests are not visible before submitting a solution, so should not be made available as inputs to a model.
- `generated_tests`: Generated tests are automatically generated by modifying inputs from public and private tests and validating using known correct solutions.
- `source`: The original source of the problem, with possible values including `UNKNOWN_SOURCE` (0),`CODECHEF` (1), `CODEFORCES` (2), `HACKEREARTH` (3), `CODEJAM` (4), `ATCODER` (5) and `AIZU` (6).
- `difficulty`: A representation of the difficulty of the problem with possible values including `UNKNOWN_DIFFICULTY` (0), `EASY` (1), `MEDIUM` (2), `HARD` (3), `HARDER` (4), `HARDEST` (5), `EXTERNAL` (6), `A` (7), `B` (8), `C` (9), `D` (10), `E` (11), `F` (12), `G` (13), `H` (14), `I` (15), `J` (16), `K` (17), `L` (18), `M` (19), `N` (20), `O` (21), `P` (22), `Q` (23), `R` (24), `S` (25), `T` (26), `U` (27) and `V` (28). Note that different sources use different, non-comparable gradings. For Codeforces problems, `cf_rating` is a more reliable measure of difficulty when available.
- `solutions`: Correct solutions to the problem. Contrast with `incorrect_solutions` below.
- `incorrect_solutions`: Incorrect solutions.
- `cf_contest_id`: The Contest ID. Note that Contest ID is not monotonic with respect to time.
- `cf_index`: Problem index, e.g. `"A"` or `"B"` or `"C"`.
- `cf_points`: Points for the problem, e.g. `1000.0`
- `cf_rating`: Problem rating (difficulty), e.g. `1100`
- `cf_tags`: Problem tags, e.g. `['greedy', 'math']`
- `is_description_translated`: Whether the problem was translated to English.
- `untranslated_description`: The untranslated description is only available for translated problems.
- `time_limit`: The time limit constraint to use when executing solutions. Represented as a dictionary with two keys, `seconds` and `nanos`. This field is None if not defined.
- `memory_limit_bytes`: The memory limit constraint to use when executing solutions.
- `input_file`: Most problems use stdin for IO. Some problems expect specific files to be used instead.
- `output_file`: Most problems use stdout for IO. Some problems expect specific files to be used instead.
All tests are represented as a paired `input` and `output` that can be used to test potential solutions and all solutions comprise a `language`, with possible values including `UNKNOWN_LANGUAGE` (0), `PYTHON` (1) (solutions written in PYTHON2), `CPP` (2), `PYTHON3` (3) and `JAVA` (4), and a `solution` string written in that `language`. The fields preceded with `cf_` denote extra meta-data for Codeforces problems.
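The integer codes for `source` and `language` can be decoded with plain lookup tables built from the value lists above. The helper below is an illustrative sketch, not part of the dataset's tooling, applied to an abridged copy of the `'76_B. Mice'` instance shown earlier:

```python
# Lookup tables transcribed from the field descriptions above;
# list index == integer code stored in the dataset.
SOURCES = ["UNKNOWN_SOURCE", "CODECHEF", "CODEFORCES", "HACKEREARTH",
           "CODEJAM", "ATCODER", "AIZU"]
LANGUAGES = ["UNKNOWN_LANGUAGE", "PYTHON", "CPP", "PYTHON3", "JAVA"]

def describe(problem):
    """Return a human-readable source name and solution languages for a record."""
    langs = [LANGUAGES[code] for code in problem["solutions"]["language"]]
    return SOURCES[problem["source"]], langs

# Abridged version of the '76_B. Mice' instance shown above.
problem = {"source": 2, "solutions": {"language": [2, 2]}}
print(describe(problem))  # ('CODEFORCES', ['CPP', 'CPP'])
```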
### Data Splits
The data is split into training, validation and test sets. The training set contains 13328 samples, the validation set 117 samples, and the test set 165 samples.
## Dataset Creation
### Curation Rationale
This dataset was created for fine-tuning AlphaCode models:
> Models pre-trained on GitHub can generate good code and solve simple programming problems, but
as shown in Appendix B.3 they can solve very few competitive programming problems. Fine-tuning
the model on a dedicated competitive programming dataset is critical for performance.
### Source Data
#### Initial Data Collection and Normalization
The information on the data collection and normalization procedures can be found in Section 3.2 and Appendix B.2 of the paper.
#### Who are the source language producers?
The problems are scraped from the following platforms: [Aizu](https://judge.u-aizu.ac.jp), [AtCoder](https://atcoder.jp), [CodeChef](https://www.codechef.com), [Codeforces](https://codeforces.com) and [HackerEarth](https://www.hackerearth.com). Additionally, some data from the existing public competitive programming datasets Description2Code ([Caballero et al., 2016](https://github.com/ethancaballero/description2code)) and CodeNet ([Puri et al., 2021](https://arxiv.org/pdf/2105.12655.pdf)) is mixed into the training set.
### Annotations
#### Annotation process
The solutions are scraped alongside the problem descriptions.
#### Who are the annotators?
Same as the source data creators.
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Yujia Li, David Choi, Junyoung Chung, Nate Kushman, Julian Schrittwieser, Rémi Leblond, Tom Eccles, James Keeling, Felix Gimeno, Agustin Dal Lago, Thomas Hubert, Peter Choy, Cyprien de Masson d'Autume, Igor Babuschkin, Xinyun Chen, Po-Sen Huang, Johannes Welbl, Sven Gowal, Alexey Cherepanov, James Molloy, Daniel J. Mankowitz, Esme Sutherland Robson, Pushmeet Kohli, Nando de Freitas, Koray Kavukcuoglu and Oriol Vinyals.
### Licensing Information
This dataset is made available under the terms of the CC BY
4.0 license ([Creative Commons Attribution 4.0 International license](https://creativecommons.org/licenses/by/4.0/legalcode)).
Additional acknowledged contributions:
* Codeforces materials are sourced from http://codeforces.com.
* Description2Code materials are sourced from:
[Description2Code Dataset](https://github.com/ethancaballero/description2code),
licensed under the
[MIT open source license](https://opensource.org/licenses/MIT), copyright
not specified.
* CodeNet materials are sourced from:
[Project_CodeNet](https://github.com/IBM/Project_CodeNet), licensed under
[Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0), copyright not
specified.
### Citation Information
```bibtex
@article{li2022competition,
title={Competition-Level Code Generation with AlphaCode},
author={Li, Yujia and Choi, David and Chung, Junyoung and Kushman, Nate and
Schrittwieser, Julian and Leblond, R{\'e}mi and Eccles, Tom and
Keeling, James and Gimeno, Felix and Dal Lago, Agustin and
Hubert, Thomas and Choy, Peter and de Masson d'Autume, Cyprien and
Babuschkin, Igor and Chen, Xinyun and Huang, Po-Sen and Welbl, Johannes and
Gowal, Sven and Cherepanov, Alexey and Molloy, James and
Mankowitz, Daniel and Sutherland Robson, Esme and Kohli, Pushmeet and
de Freitas, Nando and Kavukcuoglu, Koray and Vinyals, Oriol},
journal={arXiv preprint arXiv:2203.07814},
year={2022}
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
genaibook/images | ---
license: mit
---
|
distil-whisper/voxpopuli | ---
license: cc0-1.0
task_categories:
- automatic-speech-recognition
language:
- en
pretty_name: VoxPopuli
---
# Distil Whisper: VoxPopuli
This is a variant of the [VoxPopuli](https://huggingface.co/datasets/facebook/voxpopuli) dataset, augmented to return the pseudo-labelled Whisper
transcriptions alongside the original dataset elements. The pseudo-labelled transcriptions were generated by
labelling the input audio data with the Whisper [large-v2](https://huggingface.co/openai/whisper-large-v2)
model with *greedy* sampling. For information on how the original dataset was curated, refer to the original
[dataset card](https://huggingface.co/datasets/facebook/voxpopuli).
## Standalone Usage
First, install the latest version of the 🤗 Datasets package:
```bash
pip install --upgrade pip
pip install --upgrade datasets[audio]
```
The dataset can be downloaded and pre-processed on disk using the [`load_dataset`](https://huggingface.co/docs/datasets/v2.14.5/en/package_reference/loading_methods#datasets.load_dataset)
function:
```python
from datasets import load_dataset
dataset = load_dataset("distil-whisper/voxpopuli", "en")
# take the first sample of the validation set
sample = dataset["validation"][0]
```
It can also be streamed directly from the Hub using Datasets' [streaming mode](https://huggingface.co/blog/audio-datasets#streaming-mode-the-silver-bullet).
Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire
dataset to disk:
```python
from datasets import load_dataset
dataset = load_dataset("distil-whisper/voxpopuli", "en", streaming=True)
# take the first sample of the validation set
sample = next(iter(dataset["validation"]))
```
## Distil Whisper Usage
To use this dataset to reproduce a Distil Whisper training run, refer to the instructions on the
[Distil Whisper repository](https://github.com/huggingface/distil-whisper#training).
## License
This dataset is licensed under cc0-1.0.
|
csac/lisenca | ---
license: other
license_name: gay
license_link: LICENSE
---
|
CyberHarem/i_19_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of i_19/伊19/伊19 (Kantai Collection)
This is the dataset of i_19/伊19/伊19 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `blue_hair, long_hair, ribbon, hair_ribbon, breasts, red_eyes, twintails, large_breasts, symbol-shaped_pupils, star-shaped_pupils, tri_tails, fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 572.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_19_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 344.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_19_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1227 | 757.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_19_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 512.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/i_19_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1227 | 1.02 GiB | [Download](https://huggingface.co/datasets/CyberHarem/i_19_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/i_19_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
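Once loaded, items can be filtered by their tags. The helper below is our own sketch, assuming each `item.meta` is a dict with a `'tags'` entry (a mapping or sequence of tag names, as suggested by the print statement above); it is not part of the waifuc API:

```python
def items_with_tag(metas, tag):
    """Select metadata dicts whose 'tags' field contains the given tag.

    `metas` is assumed to be an iterable of dicts shaped like item.meta
    above, each with a 'tags' entry (a mapping or sequence of tag names).
    """
    return [m for m in metas if tag in m.get('tags', ())]

# Hypothetical usage with the waifuc source above:
# swimsuit_items = [item for item in source
#                   if 'school_swimsuit' in item.meta['tags']]

# The helper itself works on plain dicts:
sample = [
    {'filename': 'a.png', 'tags': {'1girl': 0.99, 'solo': 0.95}},
    {'filename': 'b.png', 'tags': {'1girl': 0.98}},
]
print([m['filename'] for m in items_with_tag(sample, 'solo')])  # ['a.png']
```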
## List of Clusters
Results of the tag clustering; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, blue_one-piece_swimsuit, flower-shaped_pupils, name_tag, school_swimsuit, simple_background, solo, white_background, looking_at_viewer, one-hour_drawing_challenge, smile, star_(symbol), open_mouth, blush, skin_fang, twitter_username |
| 1 | 12 |  |  |  |  |  | 1girl, one-piece_swimsuit, open_mouth, school_swimsuit, solo, blush, looking_at_viewer, star_(symbol), torpedo, smile, name_tag, hair_ornament |
| 2 | 5 |  |  |  |  |  | 1girl, blush, cleavage, looking_at_viewer, one-piece_swimsuit, open_mouth, school_swimsuit, solo, star_(symbol), torpedo, :d, hair_ornament, name_tag, water, barefoot |
| 3 | 13 |  |  |  |  |  | 1boy, 1girl, hetero, open_mouth, penis, solo_focus, blush, school_swimsuit, nipples, one-piece_swimsuit, smile, paizuri, heart-shaped_pupils, looking_at_viewer, star_(symbol), bar_censor, cum_on_breasts, huge_breasts |
| 4 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, one-piece_swimsuit, open_mouth, school_swimsuit, sex, solo_focus, vaginal, penis, swimsuit_aside, cum_in_pussy, mosaic_censoring, on_back, breasts_out, folded, heart-shaped_pupils, leg_grab, missionary, torn_swimsuit |
| 5 | 8 |  |  |  |  |  | 1girl, pleated_skirt, serafuku, solo, alternate_costume, flower-shaped_pupils, simple_background, smile, white_background, blue_sailor_collar, looking_at_viewer, open_mouth, star_(symbol), blue_skirt, cowboy_shot, multicolored_hair, neckerchief, long_sleeves, short_sleeves, dated, one-hour_drawing_challenge, shirt |
| 6 | 6 |  |  |  |  |  | flower-shaped_pupils, playboy_bunny, rabbit_ears, simple_background, strapless_leotard, 1girl, alternate_costume, cleavage, detached_collar, fake_animal_ears, solo, white_background, wrist_cuffs, black_leotard, looking_at_viewer, open_mouth, star_(symbol), blush, bowtie, gloves, holding, smile, tray |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_one-piece_swimsuit | flower-shaped_pupils | name_tag | school_swimsuit | simple_background | solo | white_background | looking_at_viewer | one-hour_drawing_challenge | smile | star_(symbol) | open_mouth | blush | skin_fang | twitter_username | one-piece_swimsuit | torpedo | hair_ornament | cleavage | :d | water | barefoot | 1boy | hetero | penis | solo_focus | nipples | paizuri | heart-shaped_pupils | bar_censor | cum_on_breasts | huge_breasts | sex | vaginal | swimsuit_aside | cum_in_pussy | mosaic_censoring | on_back | breasts_out | folded | leg_grab | missionary | torn_swimsuit | pleated_skirt | serafuku | alternate_costume | blue_sailor_collar | blue_skirt | cowboy_shot | multicolored_hair | neckerchief | long_sleeves | short_sleeves | dated | shirt | playboy_bunny | rabbit_ears | strapless_leotard | detached_collar | fake_animal_ears | wrist_cuffs | black_leotard | bowtie | gloves | holding | tray |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------------|:-----------------------|:-----------|:------------------|:--------------------|:-------|:-------------------|:--------------------|:-----------------------------|:--------|:----------------|:-------------|:--------|:------------|:-------------------|:---------------------|:----------|:----------------|:-----------|:-----|:--------|:-----------|:-------|:---------|:--------|:-------------|:----------|:----------|:----------------------|:-------------|:-----------------|:---------------|:------|:----------|:-----------------|:---------------|:-------------------|:----------|:--------------|:---------|:-----------|:-------------|:----------------|:----------------|:-----------|:--------------------|:---------------------|:-------------|:--------------|:--------------------|:--------------|:---------------|:----------------|:--------|:--------|:----------------|:--------------|:--------------------|:------------------|:-------------------|:--------------|:----------------|:---------|:---------|:----------|:-------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | | | X | X | | X | | X | | X | X | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | | X | X | | X | | X | | | X | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 13 |  |  |  |  |  | X | | | | X | | | | X | | X | X | X | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | X | | | | | | | | X | X | | | X | | | | | | | X | X | X | X | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | X | | | X | X | X | X | | X | X | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
joey234/mmlu-miscellaneous-neg-answer | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_answer
dtype: string
splits:
- name: test
num_bytes: 170602
num_examples: 783
download_size: 117116
dataset_size: 170602
---
# Dataset Card for "mmlu-miscellaneous-neg-answer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_nbeerbower__flammen10-mistral-7B | ---
pretty_name: Evaluation run of nbeerbower/flammen10-mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nbeerbower/flammen10-mistral-7B](https://huggingface.co/nbeerbower/flammen10-mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__flammen10-mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T14:50:15.022648](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__flammen10-mistral-7B/blob/main/results_2024-03-24T14-50-15.022648.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.653435030234401,\n\
\ \"acc_stderr\": 0.032096768580134716,\n \"acc_norm\": 0.6526958863367851,\n\
\ \"acc_norm_stderr\": 0.03276721111414514,\n \"mc1\": 0.5581395348837209,\n\
\ \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.7223046198250735,\n\
\ \"mc2_stderr\": 0.014602317645003922\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6988054607508533,\n \"acc_stderr\": 0.013406741767847634,\n\
\ \"acc_norm\": 0.7175767918088737,\n \"acc_norm_stderr\": 0.013155456884097225\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7106154152559251,\n\
\ \"acc_stderr\": 0.004525499540017861,\n \"acc_norm\": 0.882692690699064,\n\
\ \"acc_norm_stderr\": 0.0032112847607016566\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224469,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224469\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924006,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924006\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857416,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857416\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461766,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461766\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608308,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608308\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42905027932960893,\n\
\ \"acc_stderr\": 0.016553287863116033,\n \"acc_norm\": 0.42905027932960893,\n\
\ \"acc_norm_stderr\": 0.016553287863116033\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"\
acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6879084967320261,\n \"acc_stderr\": 0.01874501120127766,\n \
\ \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.01874501120127766\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5581395348837209,\n\
\ \"mc1_stderr\": 0.01738476747898621,\n \"mc2\": 0.7223046198250735,\n\
\ \"mc2_stderr\": 0.014602317645003922\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8271507498026835,\n \"acc_stderr\": 0.010626964529971847\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.714177407126611,\n \
\ \"acc_stderr\": 0.012444963460615634\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/flammen10-mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|arc:challenge|25_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|gsm8k|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hellaswag|10_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T14-50-15.022648.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T14-50-15.022648.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- '**/details_harness|winogrande|5_2024-03-24T14-50-15.022648.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T14-50-15.022648.parquet'
- config_name: results
data_files:
- split: 2024_03_24T14_50_15.022648
path:
- results_2024-03-24T14-50-15.022648.parquet
- split: latest
path:
- results_2024-03-24T14-50-15.022648.parquet
---
# Dataset Card for Evaluation run of nbeerbower/flammen10-mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/flammen10-mistral-7B](https://huggingface.co/nbeerbower/flammen10-mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__flammen10-mistral-7B",
"harness_winogrande_5",
	split="latest")
```
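Once a split is loaded, the per-task metrics can be aggregated locally. The sketch below averages the `acc` values of the MMLU (`hendrycksTest`) tasks from a results dictionary of the shape shown under "Latest results"; the `results` dict here is a small hypothetical excerpt, not the full evaluation output:

```python
# Hypothetical excerpt of the per-task results dictionary; the real run
# contains all 57 hendrycksTest subjects plus the other harness tasks.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.37},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
    "harness|arc:challenge|25": {"acc": 0.6988054607508533},
}

# Keep only the MMLU subjects, identified by the hendrycksTest prefix.
mmlu_scores = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]

# Unweighted mean over the selected subjects.
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(round(mmlu_avg, 4))
```

The same filtering pattern works for `acc_norm` or for other task prefixes such as `harness|arc:challenge`.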
## Latest results
These are the [latest results from run 2024-03-24T14:50:15.022648](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__flammen10-mistral-7B/blob/main/results_2024-03-24T14-50-15.022648.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.653435030234401,
"acc_stderr": 0.032096768580134716,
"acc_norm": 0.6526958863367851,
"acc_norm_stderr": 0.03276721111414514,
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.7223046198250735,
"mc2_stderr": 0.014602317645003922
},
"harness|arc:challenge|25": {
"acc": 0.6988054607508533,
"acc_stderr": 0.013406741767847634,
"acc_norm": 0.7175767918088737,
"acc_norm_stderr": 0.013155456884097225
},
"harness|hellaswag|10": {
"acc": 0.7106154152559251,
"acc_stderr": 0.004525499540017861,
"acc_norm": 0.882692690699064,
"acc_norm_stderr": 0.0032112847607016566
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224469,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224469
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924006,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924006
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857416,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461766,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461766
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608308,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608308
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42905027932960893,
"acc_stderr": 0.016553287863116033,
"acc_norm": 0.42905027932960893,
"acc_norm_stderr": 0.016553287863116033
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.01874501120127766,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.01874501120127766
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5581395348837209,
"mc1_stderr": 0.01738476747898621,
"mc2": 0.7223046198250735,
"mc2_stderr": 0.014602317645003922
},
"harness|winogrande|5": {
"acc": 0.8271507498026835,
"acc_stderr": 0.010626964529971847
},
"harness|gsm8k|5": {
"acc": 0.714177407126611,
"acc_stderr": 0.012444963460615634
}
}
```
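The results block above is plain JSON, so it can be post-processed directly once downloaded. As a small illustrative sketch (the inline excerpt and the ranking logic below are for demonstration only, not part of the evaluation harness), one can rank sub-tasks by accuracy:

```python
import json

# A small inline excerpt of the results block above, for illustration.
results = json.loads("""
{
  "harness|hendrycksTest-marketing|5": {"acc": 0.8760683760683761, "acc_stderr": 0.021586494001281365},
  "harness|hendrycksTest-machine_learning|5": {"acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588},
  "harness|gsm8k|5": {"acc": 0.714177407126611, "acc_stderr": 0.012444963460615634}
}
""")

# Rank the evaluated tasks by accuracy, highest first.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)
for task, metrics in ranked:
    print(f"{task}: acc={metrics['acc']:.3f} +/- {metrics['acc_stderr']:.3f}")
```

The same pattern applies to the full results file, since every `harness|...` entry shares the `acc`/`acc_stderr` shape (TruthfulQA uses `mc1`/`mc2` instead and would need to be handled separately).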
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_MayaPH__GodziLLa-30B | ---
pretty_name: Evaluation run of MayaPH/GodziLLa-30B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MayaPH/GodziLLa-30B](https://huggingface.co/MayaPH/GodziLLa-30B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MayaPH__GodziLLa-30B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T01:20:37.554639](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__GodziLLa-30B/blob/main/results_2023-09-17T01-20-37.554639.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.22808305369127516,\n\
\ \"em_stderr\": 0.004297060303049989,\n \"f1\": 0.34862416107382826,\n\
\ \"f1_stderr\": 0.004249472334452047,\n \"acc\": 0.3827162119062479,\n\
\ \"acc_stderr\": 0.006833824703926247\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.22808305369127516,\n \"em_stderr\": 0.004297060303049989,\n\
\ \"f1\": 0.34862416107382826,\n \"f1_stderr\": 0.004249472334452047\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \
\ \"acc_stderr\": 0.0016927007401501802\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702313\n\
\ }\n}\n```"
repo_url: https://huggingface.co/MayaPH/GodziLLa-30B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T01_20_37.554639
path:
- '**/details_harness|drop|3_2023-09-17T01-20-37.554639.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T01-20-37.554639.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T01_20_37.554639
path:
- '**/details_harness|gsm8k|5_2023-09-17T01-20-37.554639.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T01-20-37.554639.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:21:46.977528.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:21:46.977528.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:21:46.977528.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T01_20_37.554639
path:
- '**/details_harness|winogrande|5_2023-09-17T01-20-37.554639.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T01-20-37.554639.parquet'
- config_name: results
data_files:
- split: 2023_07_19T22_21_46.977528
path:
- results_2023-07-19T22:21:46.977528.parquet
- split: 2023_09_17T01_20_37.554639
path:
- results_2023-09-17T01-20-37.554639.parquet
- split: latest
path:
- results_2023-09-17T01-20-37.554639.parquet
---
# Dataset Card for Evaluation run of MayaPH/GodziLLa-30B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MayaPH/GodziLLa-30B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [MayaPH/GodziLLa-30B](https://huggingface.co/MayaPH/GodziLLa-30B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MayaPH__GodziLLa-30B",
"harness_winogrande_5",
split="train")
```
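Because the timestamped split names are zero-padded and ordered most-significant-unit first (year, month, day, ...), the most recent run can be identified with a plain lexicographic comparison — a minimal sketch using the two split names listed in this card's configs:

```python
# Timestamped split names sort lexicographically because they are
# zero-padded and most-significant-unit-first (year, month, day, ...).
splits = ["2023_07_19T22_21_46.977528", "2023_09_17T01_20_37.554639"]
latest = max(splits)  # lexicographic max == chronologically latest
print(latest)  # → 2023_09_17T01_20_37.554639
```

This is the same ordering the "latest" split relies on.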
## Latest results
These are the [latest results from run 2023-09-17T01:20:37.554639](https://huggingface.co/datasets/open-llm-leaderboard/details_MayaPH__GodziLLa-30B/blob/main/results_2023-09-17T01-20-37.554639.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.22808305369127516,
"em_stderr": 0.004297060303049989,
"f1": 0.34862416107382826,
"f1_stderr": 0.004249472334452047,
"acc": 0.3827162119062479,
"acc_stderr": 0.006833824703926247
},
"harness|drop|3": {
"em": 0.22808305369127516,
"em_stderr": 0.004297060303049989,
"f1": 0.34862416107382826,
"f1_stderr": 0.004249472334452047
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401501802
},
"harness|winogrande|5": {
"acc": 0.7616416732438832,
"acc_stderr": 0.011974948667702313
}
}
```
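Once parsed (e.g. with `json.load`), the results dict above can be queried per task; a minimal sketch using a hypothetical `get_metric` helper and the values shown (task keys follow the `"harness|task|shots"` pattern):

```python
# Subset of the nested results dict shown above.
results = {
    "all": {"acc": 0.3827162119062479, "acc_stderr": 0.006833824703926247},
    "harness|winogrande|5": {"acc": 0.7616416732438832, "acc_stderr": 0.011974948667702313},
}

def get_metric(results, task, metric):
    """Look up one metric for one task from the nested results dict."""
    return results[task][metric]

acc = get_metric(results, "harness|winogrande|5", "acc")
print(round(acc, 3))  # → 0.762
```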
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
HuggingFaceM4/debug_MMMU_mcq_to_remove | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: images
sequence: image
- name: question_type
dtype: string
- name: explanation
dtype: string
- name: topic_difficulty
dtype: string
- name: subfield
dtype: string
- name: img_type
dtype: string
splits:
- name: dev
num_bytes: 54175131.86
num_examples: 141
- name: validation
num_bytes: 323691673.26222223
num_examples: 847
- name: test
num_bytes: 3033134749.5304284
num_examples: 9873
download_size: 3305399250
dataset_size: 3411001554.652651
---
# Dataset Card for "debug_MMMU_mcq_to_remove"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pnadel/perseus_grc | ---
dataset_info:
features:
- name: file
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 194782706
num_examples: 783575
download_size: 70938486
dataset_size: 194782706
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "perseus_grc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_81_1713175982 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 362539
num_examples: 850
download_size: 184319
dataset_size: 362539
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yashraizad/yelp-open-dataset-top-businesses | ---
license: apache-2.0
---
|
jakartaresearch/id-paraphrase-detection | ---
annotations_creators:
- found
language:
- id
language_creators:
- found
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Indonesian Paraphrase Detection
size_categories:
- 1K<n<10K
source_datasets:
- extended|msrp
tags:
- msrp
- id-msrp
- paraphrase-detection
task_categories:
- sentence-similarity
task_ids: []
---
# Dataset Card for Indonesian Sentence Paraphrase Detection
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The dataset is originally from the [Microsoft Research Paraphrase Corpus](https://www.microsoft.com/en-us/download/details.aspx?id=52398). We translated the text into Indonesian using Google Translate.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
Indonesian
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@andreaschandra](https://github.com/andreaschandra) for adding this dataset. |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/94b530b4 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1339
dataset_size: 182
---
# Dataset Card for "94b530b4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rasu23/qa_all_iter0 | ---
dataset_info:
features:
- name: ids
dtype: int64
- name: source
dtype: string
- name: contxt
dtype: string
- name: predictions
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 596529477
num_examples: 24098
download_size: 159887434
dataset_size: 596529477
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alreadyamar/SaadAI | ---
license: apache-2.0
---
|
nluai/ZaloAI_Bkai | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: validation
num_bytes: 261148
num_examples: 687
download_size: 131460
dataset_size: 261148
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
introspector/lang_agent | ---
license: agpl-3.0
---
|
datahrvoje/twitter_dataset_1713033480 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 19691
num_examples: 45
download_size: 11593
dataset_size: 19691
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BobdoRock/Kithan | ---
license: openrail
---
|
ruliad/factual-expert-processed-v2_subsample_25pct | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4473909162.107858
num_examples: 1601953
download_size: 2684026369
dataset_size: 4473909162.107858
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
thanhduycao/soict_sentence_synthesis | ---
dataset_info:
features:
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 55042
num_examples: 800
download_size: 23910
dataset_size: 55042
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "soict_sentence_synthesis"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
migtissera/Synthia-v1.3 | ---
license: apache-2.0
---
|
Nerfgun3/flower_style | ---
language:
- en
license: creativeml-openrail-m
thumbnail: "https://huggingface.co/datasets/Nerfgun3/flower_style/resolve/main/flower_style_showcase.jpg"
tags:
- stable-diffusion
- text-to-image
- image-to-image
inference: false
---
# Flower Style Embedding / Textual Inversion
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/flower_style/resolve/main/flower_style_showcase.jpg"/>
## Usage
To use this embedding you have to download the file as well as drop it into the "\stable-diffusion-webui\embeddings" folder

To use it in a prompt: ```"art by flower_style"```

If it is too strong, just add [] around it.

Trained for 15,000 steps.

A version trained for 7,500 steps is included in the files as well. If you want to use that version, remove the ```"-7500"``` from the file name and replace the 15k-step version in your folder.
Have fun :)
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content

2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license

3. You may redistribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
Doub7e/SDv2-GPT4Spatial-2000-filtered1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: T5_last_hidden_states
sequence:
sequence:
sequence: float32
splits:
- name: train
num_bytes: 2415734631.25
num_examples: 1870
download_size: 1428677752
dataset_size: 2415734631.25
---
# Dataset Card for "SDv2-GPT4Spatial-2000-filtered1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yoonlee/abnormal_cat2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 3818728.0
num_examples: 9
download_size: 3820489
dataset_size: 3818728.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "abnormal_cat2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lagyamfi/akan_audio | ---
configs:
- config_name: default
data_files: ak.tar.gz
default: true
task_categories:
- translation
language:
- ak
- tw
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Rachneet__gpt2-xl-alpaca | ---
pretty_name: Evaluation run of Rachneet/gpt2-xl-alpaca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Rachneet/gpt2-xl-alpaca](https://huggingface.co/Rachneet/gpt2-xl-alpaca) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Rachneet__gpt2-xl-alpaca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T05:57:01.634897](https://huggingface.co/datasets/open-llm-leaderboard/details_Rachneet__gpt2-xl-alpaca/blob/main/results_2023-10-15T05-57-01.634897.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00576761744966443,\n\
\ \"em_stderr\": 0.0007755000442814736,\n \"f1\": 0.06548028523489936,\n\
\ \"f1_stderr\": 0.001565882245526754,\n \"acc\": 0.2845303867403315,\n\
\ \"acc_stderr\": 0.00695889831166798\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.00576761744966443,\n \"em_stderr\": 0.0007755000442814736,\n\
\ \"f1\": 0.06548028523489936,\n \"f1_stderr\": 0.001565882245526754\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.569060773480663,\n\
\ \"acc_stderr\": 0.01391779662333596\n }\n}\n```"
repo_url: https://huggingface.co/Rachneet/gpt2-xl-alpaca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|arc:challenge|25_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T05_57_01.634897
path:
- '**/details_harness|drop|3_2023-10-15T05-57-01.634897.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T05-57-01.634897.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T05_57_01.634897
path:
- '**/details_harness|gsm8k|5_2023-10-15T05-57-01.634897.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T05-57-01.634897.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hellaswag|10_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T18:01:10.182884.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T18:01:10.182884.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-18T18:01:10.182884.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T05_57_01.634897
path:
- '**/details_harness|winogrande|5_2023-10-15T05-57-01.634897.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T05-57-01.634897.parquet'
- config_name: results
data_files:
- split: 2023_07_18T18_01_10.182884
path:
- results_2023-07-18T18:01:10.182884.parquet
- split: 2023_10_15T05_57_01.634897
path:
- results_2023-10-15T05-57-01.634897.parquet
- split: latest
path:
- results_2023-10-15T05-57-01.634897.parquet
---
# Dataset Card for Evaluation run of Rachneet/gpt2-xl-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Rachneet/gpt2-xl-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Rachneet/gpt2-xl-alpaca](https://huggingface.co/Rachneet/gpt2-xl-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Rachneet__gpt2-xl-alpaca",
"harness_winogrande_5",
	split="latest")
```
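The timestamped split names in the configs above are derived from the run timestamp by replacing the characters that are not allowed in split names. A minimal sketch of that mapping (a hypothetical helper written for illustration, not part of the leaderboard tooling):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp such as '2023-10-15T05:57:01.634897'
    into the corresponding split name, e.g. '2023_10_15T05_57_01.634897'.

    Observed pattern: '-' and ':' become '_', the fractional-seconds dot is kept.
    """
    return ts.replace("-", "_").replace(":", "_")


print(run_timestamp_to_split("2023-10-15T05:57:01.634897"))
# → 2023_10_15T05_57_01.634897
```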
## Latest results
These are the [latest results from run 2023-10-15T05:57:01.634897](https://huggingface.co/datasets/open-llm-leaderboard/details_Rachneet__gpt2-xl-alpaca/blob/main/results_2023-10-15T05-57-01.634897.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the "results" and "latest" splits of the corresponding eval):
```json
{
"all": {
"em": 0.00576761744966443,
"em_stderr": 0.0007755000442814736,
"f1": 0.06548028523489936,
"f1_stderr": 0.001565882245526754,
"acc": 0.2845303867403315,
"acc_stderr": 0.00695889831166798
},
"harness|drop|3": {
"em": 0.00576761744966443,
"em_stderr": 0.0007755000442814736,
"f1": 0.06548028523489936,
"f1_stderr": 0.001565882245526754
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.569060773480663,
"acc_stderr": 0.01391779662333596
}
}
```
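Once loaded, the aggregated results above are plain nested dicts keyed by task name, with the "all" entry holding the cross-task aggregate. An illustrative snippet (not part of the evaluation harness) for pulling each task's accuracy out of such a structure:

```python
# A subset of the results structure shown above.
results = {
    "all": {"acc": 0.2845303867403315, "acc_stderr": 0.00695889831166798},
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|winogrande|5": {"acc": 0.569060773480663, "acc_stderr": 0.01391779662333596},
}

# Collect per-task accuracies, skipping the "all" aggregate
# and any task that reports no "acc" metric (e.g. drop reports em/f1).
task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}
print(task_acc)
# → {'harness|gsm8k|5': 0.0, 'harness|winogrande|5': 0.569060773480663}
```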
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
m-ric/agents_small_benchmark | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: task
dtype: string
splits:
- name: train
num_bytes: 29612
num_examples: 100
download_size: 25208
dataset_size: 29612
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tasksource/SDOH-NLI | ---
license: cc-by-4.0
task_categories:
- text-classification
language:
- en
---
SDOH-NLI is a natural language inference dataset containing ~30k premise-hypothesis pairs with binary entailment labels in the domain of social and behavioral determinants of health.
```
@misc{lelkes2023sdohnli,
title={SDOH-NLI: a Dataset for Inferring Social Determinants of Health from Clinical Notes},
author={Adam D. Lelkes and Eric Loreaux and Tal Schuster and Ming-Jun Chen and Alvin Rajkomar},
year={2023},
eprint={2310.18431},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
sumeersaifi/customer_reviews | ---
dataset_info:
features:
- name: Review
dtype: string
- name: Rating
dtype: int64
splits:
- name: train
num_bytes: 82
num_examples: 2
download_size: 1477
dataset_size: 82
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|