datasetId | card |
|---|---|
gaodrew/roco-65k-256px | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 675508431.156
num_examples: 65418
download_size: 651136006
dataset_size: 675508431.156
---
# Dataset Card for "roco-65k-256px"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TuringsSolutions/PFAF750 | ---
license: mit
---
|
ybendou/text2stats | ---
license: apache-2.0
--- |
andersonbcdefg/misc_sts_pairs_v2 | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: __index_level_0__
dtype: int64
- name: jaccard
dtype: float64
- name: sim
dtype: float64
splits:
- name: train
num_bytes: 1859391440.7831883
num_examples: 13184276
download_size: 1277788105
dataset_size: 1859391440.7831883
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-glue-4805e982-13995915 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: natural_language_inference
model: autoevaluate/glue-qqp
metrics: []
dataset_name: glue
dataset_config: qqp
dataset_split: validation
col_mapping:
text1: question1
text2: question2
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Natural Language Inference
* Model: autoevaluate/glue-qqp
* Dataset: glue
* Config: qqp
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
nampdn-ai/tiny-code-textbooks | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-generation
language:
- en
tags:
- code
pretty_name: Tiny Code Textbooks
size_categories:
- 100K<n<1M
---
# Code Explanation Textbooks
A collection of 207k synthetic code samples, each paired with an explanation, in the form of a tiny textbook. Filtered from the-stack; each programming language contributes a few thousand samples. Only the most meaningful code was selected to generate the synthetic textbooks. |
davidfant/rapidapi-example-responses-tokenized-bart | ---
dataset_info:
features:
- name: id
dtype: string
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: category
dtype: string
splits:
- name: train
num_bytes: 167674923.4914025
num_examples: 45170
- name: test
num_bytes: 18630959.5085975
num_examples: 5019
download_size: 65550667
dataset_size: 186305883.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "rapidapi-example-responses-tokenized-bart"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MoritzLaurer/cap_sotu_simple_sample | ---
dataset_info:
features:
- name: text
dtype: string
- name: labels
dtype: string
- name: label_cap2
dtype: int64
- name: label_cap2_text
dtype: string
- name: label_cap4
dtype: int64
- name: year
dtype: int64
- name: president
dtype: string
- name: pres_party
dtype: int64
- name: id_original
dtype: int64
- name: text_original
dtype: string
- name: text_preceding
dtype: string
- name: text_following
dtype: string
- name: doc_id
dtype: int64
- name: idx
dtype: int64
splits:
- name: train
num_bytes: 594625
num_examples: 1000
download_size: 328409
dataset_size: 594625
---
# Dataset Card for "cap_sotu_simple_sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
July24/Test_02_02 | ---
dataset_info:
features:
- name: text
dtype: string
- name: distances
sequence: float64
- name: calculate_confidence
dtype: float64
- name: label
dtype: int64
splits:
- name: test
num_bytes: 105542
num_examples: 463
download_size: 65348
dataset_size: 105542
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
FreedomIntelligence/sharegpt-indonesian | ---
license: apache-2.0
---
Indonesian ShareGPT data, translated by gpt-3.5-turbo.
This dataset is used in research related to [MultilingualSIFT](https://github.com/FreedomIntelligence/MultilingualSIFT). |
CyberHarem/irving_amelia_renaiflops | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Irving Amelia
This is the dataset of Irving Amelia, containing 165 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 165 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 364 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 399 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 165 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 165 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 165 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 364 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 364 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 279 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 399 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 399 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
cannlytics/cannabis_sales | ---
license: cc-by-4.0
---
|
kaleemWaheed/twitter_dataset_1712985477 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 100446
num_examples: 222
download_size: 28274
dataset_size: 100446
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-project-d42d3c12-7815012 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xtreme
eval_info:
task: entity_extraction
model: edwardjross/xlm-roberta-base-finetuned-panx-de
metrics: []
dataset_name: xtreme
dataset_config: PAN-X.de
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: edwardjross/xlm-roberta-base-finetuned-panx-de
* Dataset: xtreme
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
avalonai/id_only_corpora | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 36496496
num_examples: 989529
download_size: 25023569
dataset_size: 36496496
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
damian0815/perlin-1k | ---
license: mit
size_categories:
- n<1K
---
1000 768x768 images of 3 octaves of Perlin noise, at various brightness and contrast levels, e.g.:




|
open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-SFT-v0.1-sharded | ---
pretty_name: Evaluation run of fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded](https://huggingface.co/fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-SFT-v0.1-sharded\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-18T14:17:05.941006](https://huggingface.co/datasets/open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-SFT-v0.1-sharded/blob/main/results_2024-01-18T14-17-05.941006.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4090385972648381,\n\
\ \"acc_stderr\": 0.0341788866124078,\n \"acc_norm\": 0.41460362597527234,\n\
\ \"acc_norm_stderr\": 0.03503821200342104,\n \"mc1\": 0.3084455324357405,\n\
\ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.4504265715815864,\n\
\ \"mc2_stderr\": 0.014995497671563001\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4206484641638225,\n \"acc_stderr\": 0.014426211252508403,\n\
\ \"acc_norm\": 0.4590443686006826,\n \"acc_norm_stderr\": 0.01456229107360123\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5291774546903008,\n\
\ \"acc_stderr\": 0.004981278326428013,\n \"acc_norm\": 0.714299940250946,\n\
\ \"acc_norm_stderr\": 0.004508239594503835\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3815789473684211,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.3815789473684211,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4981132075471698,\n \"acc_stderr\": 0.030772653642075664,\n\
\ \"acc_norm\": 0.4981132075471698,\n \"acc_norm_stderr\": 0.030772653642075664\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3819444444444444,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.3819444444444444,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322674,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322674\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3815028901734104,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.3815028901734104,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.03141082197596239,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.03141082197596239\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.02326651221373057,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02326651221373057\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.42258064516129035,\n\
\ \"acc_stderr\": 0.02810096472427264,\n \"acc_norm\": 0.42258064516129035,\n\
\ \"acc_norm_stderr\": 0.02810096472427264\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n\
\ \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5696969696969697,\n \"acc_stderr\": 0.03866225962879077,\n\
\ \"acc_norm\": 0.5696969696969697,\n \"acc_norm_stderr\": 0.03866225962879077\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5454545454545454,\n \"acc_stderr\": 0.03547601494006937,\n \"\
acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.03547601494006937\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5595854922279793,\n \"acc_stderr\": 0.035827245300360945,\n\
\ \"acc_norm\": 0.5595854922279793,\n \"acc_norm_stderr\": 0.035827245300360945\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.37435897435897436,\n \"acc_stderr\": 0.024537591572830513,\n\
\ \"acc_norm\": 0.37435897435897436,\n \"acc_norm_stderr\": 0.024537591572830513\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184407,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184407\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36554621848739494,\n \"acc_stderr\": 0.031282177063684594,\n\
\ \"acc_norm\": 0.36554621848739494,\n \"acc_norm_stderr\": 0.031282177063684594\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4917431192660551,\n \"acc_stderr\": 0.021434399918214338,\n \"\
acc_norm\": 0.4917431192660551,\n \"acc_norm_stderr\": 0.021434399918214338\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2916666666666667,\n \"acc_stderr\": 0.030998666304560524,\n \"\
acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.030998666304560524\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4803921568627451,\n \"acc_stderr\": 0.03506612560524867,\n \"\
acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.03506612560524867\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4767932489451477,\n \"acc_stderr\": 0.032512152011410174,\n \
\ \"acc_norm\": 0.4767932489451477,\n \"acc_norm_stderr\": 0.032512152011410174\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4260089686098655,\n\
\ \"acc_stderr\": 0.03318833286217281,\n \"acc_norm\": 0.4260089686098655,\n\
\ \"acc_norm_stderr\": 0.03318833286217281\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4122137404580153,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.4122137404580153,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.42592592592592593,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.42592592592592593,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4171779141104294,\n \"acc_stderr\": 0.038741028598180814,\n\
\ \"acc_norm\": 0.4171779141104294,\n \"acc_norm_stderr\": 0.038741028598180814\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.049505043821289195,\n\
\ \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.049505043821289195\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6196581196581197,\n\
\ \"acc_stderr\": 0.03180425204384099,\n \"acc_norm\": 0.6196581196581197,\n\
\ \"acc_norm_stderr\": 0.03180425204384099\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5427841634738186,\n\
\ \"acc_stderr\": 0.01781438523853444,\n \"acc_norm\": 0.5427841634738186,\n\
\ \"acc_norm_stderr\": 0.01781438523853444\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.41329479768786126,\n \"acc_stderr\": 0.026511261369409247,\n\
\ \"acc_norm\": 0.41329479768786126,\n \"acc_norm_stderr\": 0.026511261369409247\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961447,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961447\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4673202614379085,\n \"acc_stderr\": 0.028568699752225868,\n\
\ \"acc_norm\": 0.4673202614379085,\n \"acc_norm_stderr\": 0.028568699752225868\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5241157556270096,\n\
\ \"acc_stderr\": 0.028365041542564577,\n \"acc_norm\": 0.5241157556270096,\n\
\ \"acc_norm_stderr\": 0.028365041542564577\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.41975308641975306,\n \"acc_stderr\": 0.027460099557005138,\n\
\ \"acc_norm\": 0.41975308641975306,\n \"acc_norm_stderr\": 0.027460099557005138\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611317,\n \
\ \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611317\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.31681877444589307,\n\
\ \"acc_stderr\": 0.011882349954723011,\n \"acc_norm\": 0.31681877444589307,\n\
\ \"acc_norm_stderr\": 0.011882349954723011\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.029029422815681393,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.029029422815681393\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4264705882352941,\n \"acc_stderr\": 0.02000791273935936,\n \
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.02000791273935936\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4326530612244898,\n \"acc_stderr\": 0.031717528240626645,\n\
\ \"acc_norm\": 0.4326530612244898,\n \"acc_norm_stderr\": 0.031717528240626645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5174129353233831,\n\
\ \"acc_stderr\": 0.03533389234739244,\n \"acc_norm\": 0.5174129353233831,\n\
\ \"acc_norm_stderr\": 0.03533389234739244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3493975903614458,\n\
\ \"acc_stderr\": 0.037117251907407486,\n \"acc_norm\": 0.3493975903614458,\n\
\ \"acc_norm_stderr\": 0.037117251907407486\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5672514619883041,\n \"acc_stderr\": 0.03799978644370607,\n\
\ \"acc_norm\": 0.5672514619883041,\n \"acc_norm_stderr\": 0.03799978644370607\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3084455324357405,\n\
\ \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.4504265715815864,\n\
\ \"mc2_stderr\": 0.014995497671563001\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6882399368587214,\n \"acc_stderr\": 0.013018571197638548\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.025018953752843062,\n \
\ \"acc_stderr\": 0.004302045046564279\n }\n}\n```"
repo_url: https://huggingface.co/fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|arc:challenge|25_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|gsm8k|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hellaswag|10_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-17-05.941006.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T14-17-05.941006.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- '**/details_harness|winogrande|5_2024-01-18T14-17-05.941006.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-18T14-17-05.941006.parquet'
- config_name: results
data_files:
- split: 2024_01_18T14_17_05.941006
path:
- results_2024-01-18T14-17-05.941006.parquet
- split: latest
path:
- results_2024-01-18T14-17-05.941006.parquet
---
# Dataset Card for Evaluation run of fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded](https://huggingface.co/fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-SFT-v0.1-sharded",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-18T14:17:05.941006](https://huggingface.co/datasets/open-llm-leaderboard/details_fierysurf__Kan-LLaMA-7B-SFT-v0.1-sharded/blob/main/results_2024-01-18T14-17-05.941006.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4090385972648381,
"acc_stderr": 0.0341788866124078,
"acc_norm": 0.41460362597527234,
"acc_norm_stderr": 0.03503821200342104,
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.4504265715815864,
"mc2_stderr": 0.014995497671563001
},
"harness|arc:challenge|25": {
"acc": 0.4206484641638225,
"acc_stderr": 0.014426211252508403,
"acc_norm": 0.4590443686006826,
"acc_norm_stderr": 0.01456229107360123
},
"harness|hellaswag|10": {
"acc": 0.5291774546903008,
"acc_stderr": 0.004981278326428013,
"acc_norm": 0.714299940250946,
"acc_norm_stderr": 0.004508239594503835
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3815789473684211,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.3815789473684211,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4981132075471698,
"acc_stderr": 0.030772653642075664,
"acc_norm": 0.4981132075471698,
"acc_norm_stderr": 0.030772653642075664
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3819444444444444,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.3819444444444444,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322674,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322674
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3815028901734104,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.3815028901734104,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.03141082197596239,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.03141082197596239
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02326651221373057,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02326651221373057
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.42258064516129035,
"acc_stderr": 0.02810096472427264,
"acc_norm": 0.42258064516129035,
"acc_norm_stderr": 0.02810096472427264
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5696969696969697,
"acc_stderr": 0.03866225962879077,
"acc_norm": 0.5696969696969697,
"acc_norm_stderr": 0.03866225962879077
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.03547601494006937,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.03547601494006937
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5595854922279793,
"acc_stderr": 0.035827245300360945,
"acc_norm": 0.5595854922279793,
"acc_norm_stderr": 0.035827245300360945
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.37435897435897436,
"acc_stderr": 0.024537591572830513,
"acc_norm": 0.37435897435897436,
"acc_norm_stderr": 0.024537591572830513
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184407,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184407
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36554621848739494,
"acc_stderr": 0.031282177063684594,
"acc_norm": 0.36554621848739494,
"acc_norm_stderr": 0.031282177063684594
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4917431192660551,
"acc_stderr": 0.021434399918214338,
"acc_norm": 0.4917431192660551,
"acc_norm_stderr": 0.021434399918214338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.030998666304560524,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.030998666304560524
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.03506612560524867,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.03506612560524867
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4767932489451477,
"acc_stderr": 0.032512152011410174,
"acc_norm": 0.4767932489451477,
"acc_norm_stderr": 0.032512152011410174
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4260089686098655,
"acc_stderr": 0.03318833286217281,
"acc_norm": 0.4260089686098655,
"acc_norm_stderr": 0.03318833286217281
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4122137404580153,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.4122137404580153,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068383,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068383
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4171779141104294,
"acc_stderr": 0.038741028598180814,
"acc_norm": 0.4171779141104294,
"acc_norm_stderr": 0.038741028598180814
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6196581196581197,
"acc_stderr": 0.03180425204384099,
"acc_norm": 0.6196581196581197,
"acc_norm_stderr": 0.03180425204384099
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5427841634738186,
"acc_stderr": 0.01781438523853444,
"acc_norm": 0.5427841634738186,
"acc_norm_stderr": 0.01781438523853444
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.41329479768786126,
"acc_stderr": 0.026511261369409247,
"acc_norm": 0.41329479768786126,
"acc_norm_stderr": 0.026511261369409247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961447,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961447
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4673202614379085,
"acc_stderr": 0.028568699752225868,
"acc_norm": 0.4673202614379085,
"acc_norm_stderr": 0.028568699752225868
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5241157556270096,
"acc_stderr": 0.028365041542564577,
"acc_norm": 0.5241157556270096,
"acc_norm_stderr": 0.028365041542564577
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.41975308641975306,
"acc_stderr": 0.027460099557005138,
"acc_norm": 0.41975308641975306,
"acc_norm_stderr": 0.027460099557005138
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611317,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611317
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.31681877444589307,
"acc_stderr": 0.011882349954723011,
"acc_norm": 0.31681877444589307,
"acc_norm_stderr": 0.011882349954723011
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.029029422815681393,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.029029422815681393
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.02000791273935936,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.02000791273935936
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4326530612244898,
"acc_stderr": 0.031717528240626645,
"acc_norm": 0.4326530612244898,
"acc_norm_stderr": 0.031717528240626645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5174129353233831,
"acc_stderr": 0.03533389234739244,
"acc_norm": 0.5174129353233831,
"acc_norm_stderr": 0.03533389234739244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3493975903614458,
"acc_stderr": 0.037117251907407486,
"acc_norm": 0.3493975903614458,
"acc_norm_stderr": 0.037117251907407486
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5672514619883041,
"acc_stderr": 0.03799978644370607,
"acc_norm": 0.5672514619883041,
"acc_norm_stderr": 0.03799978644370607
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.4504265715815864,
"mc2_stderr": 0.014995497671563001
},
"harness|winogrande|5": {
"acc": 0.6882399368587214,
"acc_stderr": 0.013018571197638548
},
"harness|gsm8k|5": {
"acc": 0.025018953752843062,
"acc_stderr": 0.004302045046564279
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pharaouk/cortex_alpha | ---
dataset_info:
features:
- name: prompts
dtype: string
- name: responses
dtype: string
splits:
- name: train
num_bytes: 73337509
num_examples: 32374
download_size: 36465857
dataset_size: 73337509
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Codebmk/opus_ubuntu_lg_to_en | ---
license: bsd-3-clause
task_categories:
- translation
language:
- lg
- en
--- |
SkunkworksAI-shared/concatenated_1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
splits:
- name: train
num_bytes: 4580744904
num_examples: 2527636
download_size: 2447560359
dataset_size: 4580744904
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "concatenated_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
renumics/spotlight-boolq-enrichment | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: question.embedding
sequence: float32
length: 2
- name: passage.embedding
sequence: float32
length: 2
splits:
- name: train
num_bytes: 150832
num_examples: 9427
- name: validation
num_bytes: 52320
num_examples: 3270
download_size: 284725
dataset_size: 203152
---
# Dataset Card for "spotlight-boolq-enrichment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fxmeng/chess_annotation | ---
configs:
- config_name: default
data_files:
- split: train_mc
path: data/train_mc-*
- split: test_mc
path: data/test_mc-*
- split: train_open
path: data/train_open-*
- split: test_open
path: data/test_open-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: board_svg
dtype: string
splits:
- name: train_mc
num_bytes: 89387563
num_examples: 2899
- name: test_mc
num_bytes: 3081009
num_examples: 100
- name: train_open
num_bytes: 88156600
num_examples: 2899
- name: test_open
num_bytes: 3039592
num_examples: 100
download_size: 32995616
dataset_size: 183664764
---
# Dataset Card for "chess_annotation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
christykoh/imdb_es | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
splits:
- name: train
num_bytes: 10545575
num_examples: 25000
- name: test
num_bytes: 10543577
num_examples: 25000
download_size: 11415991
dataset_size: 21089152
---
# Dataset Card for "imdb_es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_129 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1129960628.0
num_examples: 221909
download_size: 1154692831
dataset_size: 1129960628.0
---
# Dataset Card for "chunk_129"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hacktoberfest-corpus-es/colmbian_spanish_news | ---
license: cc-by-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: news_id
dtype: string
- name: news_url_absolute
dtype: string
- name: news_init_date
dtype: string
- name: news_final_date
dtype: string
- name: news_title
dtype: string
- name: news_text_content
dtype: string
- name: entailment
dtype: float64
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 262518060.61903325
num_examples: 60920
- name: test
num_bytes: 13130212.257160116
num_examples: 3047
- name: valid
num_bytes: 52503612.12380665
num_examples: 12184
download_size: 195538787
dataset_size: 328151885.0
---
|
bigcode/commits-pjj-diff | ---
dataset_info:
features:
- name: commit
dtype: string
- name: subject
dtype: string
- name: old_file
dtype: string
- name: new_file
dtype: string
- name: old_contents
dtype: string
- name: lang
dtype: string
- name: proba
dtype: float64
- name: diff
dtype: string
splits:
- name: train
num_bytes: 37542628668.41245
num_examples: 2597493
download_size: 3991157226
dataset_size: 37542628668.41245
---
# Dataset Card for "commits-pjj-diff"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LaraProGamer/LaraProGamer-dataset | ---
license: unknown
language:
- en
--- |
Ioana23/github-issues | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: 'null'
- name: closed_at
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 19510922
num_examples: 3047
download_size: 5482325
dataset_size: 19510922
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DurreSudoku/fma_small_images | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Electronic
'1': Experimental
'2': Folk
'3': Hip-Hop
'4': Instrumental
'5': International
'6': Pop
'7': Rock
splits:
- name: train
num_bytes: 647880981.875
num_examples: 31945
- name: validation
num_bytes: 81400098.625
num_examples: 3995
- name: test
num_bytes: 81537172.0
num_examples: 4000
download_size: 810704600
dataset_size: 810818252.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
ShrinivasSK/hi_en_3 | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: tgt
dtype: string
- name: src
dtype: string
splits:
- name: train
num_bytes: 6366803.4
num_examples: 18000
- name: test
num_bytes: 707422.6
num_examples: 2000
download_size: 3789240
dataset_size: 7074226.0
---
# Dataset Card for "hi_en_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
andersonbcdefg/arxiv-triples-filtered | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
- name: qp
dtype: float64
- name: qn
dtype: float64
- name: margin
dtype: float64
splits:
- name: train
num_bytes: 3028180464
num_examples: 1551662
download_size: 1829334625
dataset_size: 3028180464
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bittensor-dataset/twitter-image-dataset | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tweet_content
dtype: string
- name: image_url
dtype: string
splits:
- name: train
num_bytes: 83617693
num_examples: 288947
download_size: 37483354
dataset_size: 83617693
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
rhenoa24/HolySinRVC | ---
license: openrail
---
|
qianyu88/call-center-speech | ---
license: mit
---
|
open-llm-leaderboard/details_abhinand__malayalam-llama-7b-instruct-v0.1 | ---
pretty_name: Evaluation run of abhinand/malayalam-llama-7b-instruct-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abhinand/malayalam-llama-7b-instruct-v0.1](https://huggingface.co/abhinand/malayalam-llama-7b-instruct-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhinand__malayalam-llama-7b-instruct-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-23T18:17:56.469419](https://huggingface.co/datasets/open-llm-leaderboard/details_abhinand__malayalam-llama-7b-instruct-v0.1/blob/main/results_2024-01-23T18-17-56.469419.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24108004755305937,\n\
\ \"acc_stderr\": 0.029966410902590897,\n \"acc_norm\": 0.24113007863184896,\n\
\ \"acc_norm_stderr\": 0.030748896883397367,\n \"mc1\": 0.28151774785801714,\n\
\ \"mc1_stderr\": 0.01574402724825605,\n \"mc2\": 0.47114186838793193,\n\
\ \"mc2_stderr\": 0.015324555786314642\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3575085324232082,\n \"acc_stderr\": 0.014005494275916573,\n\
\ \"acc_norm\": 0.3720136518771331,\n \"acc_norm_stderr\": 0.014124597881844461\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5427205735909182,\n\
\ \"acc_stderr\": 0.004971534874389941,\n \"acc_norm\": 0.67805218084047,\n\
\ \"acc_norm_stderr\": 0.0046626822330937704\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.28151774785801714,\n \"mc1_stderr\": 0.01574402724825605,\n\
\ \"mc2\": 0.47114186838793193,\n \"mc2_stderr\": 0.015324555786314642\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.6290449881610103,\n\
\ \"acc_stderr\": 0.01357639990223157\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/abhinand/malayalam-llama-7b-instruct-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|arc:challenge|25_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|arc:challenge|25_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|gsm8k|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|gsm8k|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hellaswag|10_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hellaswag|10_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T15-03-09.994795.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T18-17-56.469419.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T18-17-56.469419.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- '**/details_harness|winogrande|5_2024-01-23T15-03-09.994795.parquet'
- split: 2024_01_23T18_17_56.469419
path:
- '**/details_harness|winogrande|5_2024-01-23T18-17-56.469419.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-23T18-17-56.469419.parquet'
- config_name: results
data_files:
- split: 2024_01_23T15_03_09.994795
path:
- results_2024-01-23T15-03-09.994795.parquet
- split: 2024_01_23T18_17_56.469419
path:
- results_2024-01-23T18-17-56.469419.parquet
- split: latest
path:
- results_2024-01-23T18-17-56.469419.parquet
---
# Dataset Card for Evaluation run of abhinand/malayalam-llama-7b-instruct-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhinand/malayalam-llama-7b-instruct-v0.1](https://huggingface.co/abhinand/malayalam-llama-7b-instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhinand__malayalam-llama-7b-instruct-v0.1",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-23T18:17:56.469419](https://huggingface.co/datasets/open-llm-leaderboard/details_abhinand__malayalam-llama-7b-instruct-v0.1/blob/main/results_2024-01-23T18-17-56.469419.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24108004755305937,
"acc_stderr": 0.029966410902590897,
"acc_norm": 0.24113007863184896,
"acc_norm_stderr": 0.030748896883397367,
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.47114186838793193,
"mc2_stderr": 0.015324555786314642
},
"harness|arc:challenge|25": {
"acc": 0.3575085324232082,
"acc_stderr": 0.014005494275916573,
"acc_norm": 0.3720136518771331,
"acc_norm_stderr": 0.014124597881844461
},
"harness|hellaswag|10": {
"acc": 0.5427205735909182,
"acc_stderr": 0.004971534874389941,
"acc_norm": 0.67805218084047,
"acc_norm_stderr": 0.0046626822330937704
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28151774785801714,
"mc1_stderr": 0.01574402724825605,
"mc2": 0.47114186838793193,
"mc2_stderr": 0.015324555786314642
},
"harness|winogrande|5": {
"acc": 0.6290449881610103,
"acc_stderr": 0.01357639990223157
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
annotations_creators:
- crowdsourced
- machine-generated
language_creators:
- found
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
- n<1K
source_datasets:
- original
task_categories:
- other
task_ids: []
paperswithcode_id: fewrel
pretty_name: Few-Shot Relation Classification Dataset
tags:
- relation-extraction
dataset_info:
- config_name: default
features:
- name: relation
dtype: string
- name: tokens
sequence: string
- name: head
struct:
- name: text
dtype: string
- name: type
dtype: string
- name: indices
sequence:
sequence: int64
- name: tail
struct:
- name: text
dtype: string
- name: type
dtype: string
- name: indices
sequence:
sequence: int64
- name: names
sequence: string
splits:
- name: train_wiki
num_bytes: 19923155
num_examples: 44800
- name: val_nyt
num_bytes: 1385642
num_examples: 2500
- name: val_pubmed
num_bytes: 488502
num_examples: 1000
- name: val_semeval
num_bytes: 2646249
num_examples: 8851
- name: val_wiki
num_bytes: 5147348
num_examples: 11200
- name: pubmed_unsupervised
num_bytes: 1117703
num_examples: 2500
download_size: 22674323
dataset_size: 30708599
- config_name: pid2name
features:
- name: relation
dtype: string
- name: names
sequence: string
splits:
- name: pid2name
num_bytes: 81607
num_examples: 744
download_size: 22674323
dataset_size: 81607
config_names:
- default
- pid2name
---
# Dataset Card for few_rel
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [GitHub Page](https://thunlp.github.io/)
- **Repository:** [GitHub](https://github.com/thunlp/FewRel)
- **Paper:** [FewRel](https://arxiv.org/abs/1810.10147), [FewRel 2.0](https://arxiv.org/abs/1910.07124)
- **Leaderboard:** [GitHub Leaderboard](https://thunlp.github.io/fewrel.html)
- **Point of Contact:** [Needs More Information]
### Dataset Summary
FewRel is a large-scale few-shot relation extraction dataset, which contains more than one hundred relations and tens of thousands of annotated instances across different domains.
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
The dataset contains English text, as used by writers on Wikipedia, and crowdsourced English annotations.
## Dataset Structure
### Data Instances
An instance from `train_wiki` split:
```
{'head': {'indices': [[16]], 'text': 'tjq', 'type': 'Q1331049'}, 'names': ['place served by transport hub', 'territorial entity or entities served by this transport hub (airport, train station, etc.)'], 'relation': 'P931', 'tail': {'indices': [[13, 14]], 'text': 'tanjung pandan', 'type': 'Q3056359'}, 'tokens': ['Merpati', 'flight', '106', 'departed', 'Jakarta', '(', 'CGK', ')', 'on', 'a', 'domestic', 'flight', 'to', 'Tanjung', 'Pandan', '(', 'TJQ', ')', '.']}
```
### Data Fields
For `default`:
- `relation`: a `string` feature containing PID of the relation.
- `tokens`: a `list` of `string` features containing tokens for the text.
- `head`: a dictionary containing:
- `text`: a `string` feature representing the head entity.
- `type`: a `string` feature representing the type of the head entity.
- `indices`: a `list` containing `list` of token indices.
- `tail`: a dictionary containing:
- `text`: a `string` feature representing the tail entity.
- `type`: a `string` feature representing the type of the tail entity.
- `indices`: a `list` containing `list` of token indices.
- `names`: a `list` of `string` features containing relation names. For `pubmed_unsupervised` split, this is set to a `list` with an empty `string`. For `val_semeval` and `val_pubmed` split, this is set to a `list` with the `string` from the `relation` field.
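As an illustration of how the `indices` field maps back into `tokens`, here is a minimal sketch using the `train_wiki` instance shown above (each inner list in `indices` is one entity mention):

```python
# Abridged train_wiki instance (same values as the example above)
example = {
    "tokens": ["Merpati", "flight", "106", "departed", "Jakarta", "(", "CGK", ")",
               "on", "a", "domestic", "flight", "to", "Tanjung", "Pandan", "(",
               "TJQ", ")", "."],
    "head": {"text": "tjq", "type": "Q1331049", "indices": [[16]]},
    "tail": {"text": "tanjung pandan", "type": "Q3056359", "indices": [[13, 14]]},
}

def mention_surface_forms(entity, tokens):
    """Join the tokens of each mention listed in an entity's `indices`."""
    return [" ".join(tokens[i] for i in mention) for mention in entity["indices"]]

print(mention_surface_forms(example["head"], example["tokens"]))  # ['TJQ']
print(mention_surface_forms(example["tail"], example["tokens"]))  # ['Tanjung Pandan']
```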
### Data Splits
`train_wiki`: 44800
`val_nyt`: 2500
`val_pubmed`: 1000
`val_semeval`: 8851
`val_wiki`: 11200
`pubmed_unsupervised`: 2500
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
For FewRel:
Han, Xu and Zhu, Hao and Yu, Pengfei and Wang, Ziyun and Yao, Yuan and Liu, Zhiyuan and Sun, Maosong
For FewRel 2.0:
Gao, Tianyu and Han, Xu and Zhu, Hao and Liu, Zhiyuan and Li, Peng and Sun, Maosong and Zhou, Jie
### Licensing Information
```
MIT License
Copyright (c) 2018 THUNLP
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
```
### Citation Information
```
@inproceedings{han-etal-2018-fewrel,
title = "{F}ew{R}el: A Large-Scale Supervised Few-Shot Relation Classification Dataset with State-of-the-Art Evaluation",
author = "Han, Xu and Zhu, Hao and Yu, Pengfei and Wang, Ziyun and Yao, Yuan and Liu, Zhiyuan and Sun, Maosong",
booktitle = "Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing",
month = oct # "-" # nov,
year = "2018",
address = "Brussels, Belgium",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/D18-1514",
doi = "10.18653/v1/D18-1514",
pages = "4803--4809"
}
```
```
@inproceedings{gao-etal-2019-fewrel,
title = "{F}ew{R}el 2.0: Towards More Challenging Few-Shot Relation Classification",
author = "Gao, Tianyu and Han, Xu and Zhu, Hao and Liu, Zhiyuan and Li, Peng and Sun, Maosong and Zhou, Jie",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)",
month = nov,
year = "2019",
address = "Hong Kong, China",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/D19-1649",
doi = "10.18653/v1/D19-1649",
pages = "6251--6256"
}
```
### Contributions
Thanks to [@gchhablani](https://github.com/gchhablani) for adding this dataset. |
themanas021/chasat-algebra-sub02 | ---
license: mit
---
|
La-matrice/french_temperatures_420M | ---
configs:
- config_name: normalized_dataset
data_files: dataset_normalized.parquet
- config_name: original_dataset
data_files: dataset.parquet
task_categories:
- time-series-forecasting
language:
- fr
- en
tags:
- temperature
size_categories:
- 100M<n<1B
---
## This dataset offers hourly temperature records from all of France's metropolitan and overseas weather stations, from 1777 to March 12, 2024
The data comes from the Météo France database, made available as open source since January 1, 2024.
|Features | Description |
|-------------|--------------------------------------------------------------------------------|
| LAT         | Latitude, negative south (in degrees and millionths of a degree)              |
| LON | Longitude, negative west of GREENWICH (in degrees and millionths of a degree) |
| ALTI | Altitude of the base of the shelter or rain gauge (in mm) |
| AAAAMMJJHH | Date of measurement (year, month, day and time) |
| T           | Instantaneous temperature under shelter (in °C, to 1/10 of a degree)          |
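For convenience, the `AAAAMMJJHH` timestamp can be unpacked into a standard datetime; a minimal sketch (field name taken from the table above, assuming zero-padded values):

```python
from datetime import datetime

def parse_aaaammjjhh(value) -> datetime:
    # AAAAMMJJHH packs year, month, day and hour, e.g. 2024031215 -> 2024-03-12 15:00
    return datetime.strptime(str(value), "%Y%m%d%H")

print(parse_aaaammjjhh(2024031215))  # 2024-03-12 15:00:00
```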
---
Two files are available:
- the original dataset
- the normalized dataset (with basic Min-Max normalization) |
vira-chatbot/vira-dialog-acts-live | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 23507
num_examples: 571
- name: validation
num_bytes: 3165
num_examples: 71
- name: test
num_bytes: 2591
num_examples: 72
download_size: 20086
dataset_size: 29263
---
# Dataset Card for "vira-dialog-acts-live"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigscience-data/roots_ar_sanad | ---
language: ar
license: cc-by-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_ar_sanad
# sanad
- Dataset uid: `sanad`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.1312 % of total
- 1.2094 % of ar
### BigScience processing steps
#### Filters applied to: ar
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- remove_html_spans_sanad
- filter_small_docs_bytes_300
|
AbhishekJosyula/Wikisun_QA_Format | ---
language:
- en
license: apache-2.0
size_categories:
- 100K<n<1M
pretty_name: WikisumQA
dataset_info:
features:
- name: Text
dtype: string
splits:
- name: train
num_bytes: 1109803353
num_examples: 180889
download_size: 597355404
dataset_size: 1109803353
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Shirleyphd/Pug | ---
license: creativeml-openrail-m
tags:
- pytorch
- diffusers
- stable-diffusion
- text-to-image
- diffusion-models-class
- dreambooth-hackathon
- animal
widget:
- text: a photo of pug dog in a cup
---
# DreamBooth model for pugsly trained by lewtun on the Shirleyphd/Pug dataset.
This is a Stable Diffusion model fine-tuned on the `pugsly` concept taught to Stable Diffusion with DreamBooth.
It can be used by modifying the `instance_prompt`: **a photo of pugsly dog**
This model was created as part of the DreamBooth Hackathon 🔥. Visit the [organisation page](https://huggingface.co/dreambooth-hackathon) for instructions on how to take part!
## Description
This is a Stable Diffusion model fine-tuned on `dog` images for the animal theme.
## Usage
```python
from diffusers import StableDiffusionPipeline
pipeline = StableDiffusionPipeline.from_pretrained('Shirleyphd/Pug-dog')
image = pipeline().images[0]
image
``` |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/9f0acde5 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 176
num_examples: 10
download_size: 1332
dataset_size: 176
---
# Dataset Card for "9f0acde5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TinyPixel/f_1 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1651066476
num_examples: 1000000
download_size: 953819312
dataset_size: 1651066476
---
# Dataset Card for "f_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-high_school_world_history-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 90952
num_examples: 237
download_size: 56701
dataset_size: 90952
---
# Dataset Card for "mmlu-high_school_world_history-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atayu/cavendish_banana_classification | ---
task_categories:
- zero-shot-classification
language:
- id
- en
tags:
- classification
- fruit
pretty_name: banana classifying
size_categories:
- 100K<n<1M
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nlpso/m1_qualitative_analysis_ref_ptrn_cmbert_io | ---
language:
- fr
multilinguality:
- monolingual
task_categories:
- token-classification
---
# m1_qualitative_analysis_ref_ptrn_cmbert_io
## Introduction
This dataset was used to perform **qualitative analysis** of [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained) on a **nested NER task** using the Independent NER layers approach [M1].
It contains Paris trade directory entries from the 19th century.
## Dataset parameters
* Approach : M1
* Dataset type : ground-truth
* Tokenizer : [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained)
* Tagging format : IO
* Counts :
* Train : 6084
* Dev : 676
* Test : 1685
* Associated fine-tuned models :
    * Level 1 : [nlpso/m1_ind_layers_ref_ptrn_cmbert_io_level_1](https://huggingface.co/nlpso/m1_ind_layers_ref_ptrn_cmbert_io_level_1)
* Level 2 : [nlpso/m1_ind_layers_ref_ptrn_cmbert_io_level_2](https://huggingface.co/nlpso/m1_ind_layers_ref_ptrn_cmbert_io_level_2)
## Entity types
Abbreviation|Entity group (level)|Description
-|-|-
O |1 & 2|Outside of a named entity
PER |1|Person or company name
ACT |1 & 2|Person or company professional activity
TITREH |2|Military or civil distinction
DESC |1|Entry full description
TITREP |2|Professionnal reward
SPAT |1|Address
LOC |2|Street name
CARDINAL |2|Street number
FT |2|Geographical feature
## How to use this dataset
```python
from datasets import load_dataset

train_dev_test = load_dataset("nlpso/m1_qualitative_analysis_ref_ptrn_cmbert_io")
```
|
MathDG/DocLayNet-base-law | ---
license: other
license_name: other
license_link: LICENSE
dataset_info:
features:
- name: id
dtype: string
- name: texts
sequence: string
- name: bboxes_block
sequence:
sequence: int64
- name: bboxes_line
sequence:
sequence: int64
- name: categories
sequence:
class_label:
names:
'0': Caption
'1': Footnote
'2': Formula
'3': List-item
'4': Page-footer
'5': Page-header
'6': Picture
'7': Section-header
'8': Table
'9': Text
'10': Title
- name: image
dtype: image
- name: page_hash
dtype: string
- name: original_filename
dtype: string
- name: page_no
dtype: int32
- name: num_pages
dtype: int32
- name: original_width
dtype: int32
- name: original_height
dtype: int32
- name: coco_width
dtype: int32
- name: coco_height
dtype: int32
- name: collection
dtype: string
- name: doc_category
dtype: string
splits:
- name: train
num_bytes: 437969372.61070913
num_examples: 1084
- name: validation
num_bytes: 45861372.73611111
num_examples: 117
- name: test
num_bytes: 33038984.336673345
num_examples: 86
download_size: 596745929
dataset_size: 516869729.68349355
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
UTibetNLP/tibetan_news_classification | ---
language:
- bo
---
# Tibetan News Classification Corpus
**This is the open-sourced training corpus of our [Tibetan BERT Model](https://huggingface.co/UTibetNLP/tibetan_bert).**
## Citation
Please cite our [paper](https://dl.acm.org/doi/10.1145/3548608.3559255) if you use this training corpus or the model:
```
@inproceedings{10.1145/3548608.3559255,
author = {Zhang, Jiangyan and Kazhuo, Deji and Gadeng, Luosang and Trashi, Nyima and Qun, Nuo},
title = {Research and Application of Tibetan Pre-Training Language Model Based on BERT},
year = {2022},
isbn = {9781450397179},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3548608.3559255},
doi = {10.1145/3548608.3559255},
abstract = {In recent years, pre-training language models have been widely used in the field of natural language processing, but the research on Tibetan pre-training language models is still in the exploratory stage. To promote the further development of Tibetan natural language processing and effectively solve the problem of the scarcity of Tibetan annotation data sets, the article studies the Tibetan pre-training language model based on BERT. First, given the characteristics of the Tibetan language, we constructed a data set for the BERT pre-training language model and downstream text classification tasks. Secondly, construct a small-scale Tibetan BERT pre-training language model to train it. Finally, the performance of the model was verified through the downstream task of Tibetan text classification, and an accuracy rate of 86\% was achieved on the task of text classification. Experiments show that the model we built has a significant effect on the task of Tibetan text classification.},
booktitle = {Proceedings of the 2022 2nd International Conference on Control and Intelligent Robotics},
pages = {519–524},
numpages = {6},
location = {Nanjing, China},
series = {ICCIR '22}
}
``` |
autoevaluate/autoeval-eval-ccdv__arxiv-summarization-document-47d12e-1465753970 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- ccdv/arxiv-summarization
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-base-16384-booksum-V12
metrics: []
dataset_name: ccdv/arxiv-summarization
dataset_config: document
dataset_split: test
col_mapping:
text: article
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-base-16384-booksum-V12
* Dataset: ccdv/arxiv-summarization
* Config: document
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
nvidia/HelpSteer | ---
license: cc-by-4.0
language:
- en
pretty_name: Helpfulness SteerLM Dataset
size_categories:
- 10K<n<100K
tags:
- human-feedback
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: helpfulness
dtype: int32
- name: correctness
dtype: int32
- name: coherence
dtype: int32
- name: complexity
dtype: int32
- name: verbosity
dtype: int32
splits:
- name: train
num_examples: 35331
- name: validation
num_examples: 1789
---
# HelpSteer: Helpfulness SteerLM Dataset
HelpSteer is an open-source Helpfulness Dataset (CC-BY-4.0) that supports aligning models to become more helpful, factually correct and coherent, while being adjustable in terms of the complexity and verbosity of its responses.
Leveraging this dataset and SteerLM, we train a Llama 2 70B to reach **7.54** on MT Bench, the highest among models trained on open-source datasets based on [MT Bench Leaderboard](https://huggingface.co/spaces/lmsys/chatbot-arena-leaderboard) as of 15 Nov 2023.
This model is available on HF at [Llama2-70B-SteerLM-Chat](https://huggingface.co/nvidia/Llama2-70B-SteerLM-Chat).
Try this model instantly for free hosted by us at [NVIDIA AI Playground](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/ai-foundation/models/llama2-70b-steerlm). You can use this in the provided UI or through a limited access API (up to 10,000 requests within 30 days).
If you would need more requests, we demonstrate how you can set up an inference server at [Llama2-70B-SteerLM-Chat model page on HF](https://huggingface.co/nvidia/Llama2-70B-SteerLM-Chat)
You can also train a model using [NeMo Aligner](https://github.com/NVIDIA/NeMo-Aligner) following [SteerLM training user guide](https://docs.nvidia.com/nemo-framework/user-guide/latest/modelalignment/steerlm.html).
<img src="https://huggingface.co/datasets/nvidia/HelpSteer/resolve/main/mtbench_categories.png" alt="MT Bench Categories" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
HelpSteer Paper : [HelpSteer: Multi-attribute Helpfulness Dataset for SteerLM](http://arxiv.org/abs/2311.09528)
SteerLM Paper: [SteerLM: Attribute Conditioned SFT as an (User-Steerable) Alternative to RLHF](https://arxiv.org/abs/2310.05344)
## Dataset Description
HelpSteer contains 37,120 samples, each containing a prompt, a response, and five human-annotated attributes of the response, each ranging between 0 and 4, where higher means better for each attribute.
These attributes are:
1. **Helpfulness**: Overall helpfulness of the response to the prompt.
2. **Correctness**: Inclusion of all pertinent facts without errors.
3. **Coherence**: Consistency and clarity of expression.
4. **Complexity**: Intellectual depth required to write response (i.e. whether the response can be written by anyone with basic language competency or requires deep domain expertise).
5. **Verbosity**: Amount of detail included in the response, relative to what is asked for in the prompt.
## Example
```json
{
"prompt": "What are the three most important things to consider when deciding what technology to use to build an assist device to help an elderly person with basic needs?",
"response": "To build an assistive device to help an elderly person with basic needs, one must consider three crucial things: safety, compatibility, and ease of use. Safety is paramount, as the device must not cause harm to the user. Compatibility with the user's environment and other devices is also essential. Finally, the device must be simple enough for the elderly person to operate.",
"helpfulness": 3,
"correctness": 4,
"coherence": 4,
"complexity": 2,
"verbosity": 2
}
```
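Since every attribute is an integer on the 0–4 Likert scale described above, here is a minimal sketch for sanity-checking a sample (attribute names taken from the schema above; this helper is an illustration, not part of the official release):

```python
ATTRIBUTES = ("helpfulness", "correctness", "coherence", "complexity", "verbosity")

def is_valid_sample(sample: dict) -> bool:
    """Check that all five attributes are present and lie in the 0-4 range."""
    return all(
        isinstance(sample.get(a), int) and 0 <= sample[a] <= 4 for a in ATTRIBUTES
    )

sample = {"prompt": "...", "response": "...", "helpfulness": 3, "correctness": 4,
          "coherence": 4, "complexity": 2, "verbosity": 2}
print(is_valid_sample(sample))  # True
```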
## Using the Huggingface Datasets
```python
from datasets import load_dataset
ds = load_dataset("nvidia/HelpSteer")
train = ds['train'] # len(train) = 35331 (95%)
val = ds['validation'] # len(val) = 1789 (5%)
```
## Source
1. Prompts are collected as a mixture of template-generated prompts (mainly for prompts involving long reference text) and human-generated prompts written by Scale AI. These prompts relate to the tasks of Rewrite, Summarization, Classification, Extraction, Closed Question Answering, Open Question Answering, Generation and Brainstorming.
2. Responses are generated by an early version of an in-house LLM. We generate up to 4 responses per prompt using sampling techniques to produce diverse yet reasonable responses.
3. Annotations of various attributes were done by Scale AI. Annotators rated each response on a Likert 5 scale (between 0 and 4) for each attribute (helpfulness, correctness, coherence, complexity and verbosity).
## Annotation methodology (short)
1. We engaged a select group of contractors via Scale AI. These contractors were provided with comprehensive guidelines that defined each attribute and the criteria for every rating level, together with some annotated examples. These guidelines and examples are detailed in the Appendix of the accompanying paper.
2. The annotation process involved approximately 200 U.S.-based human annotators. Candidates first underwent preliminary assignments, including assessments of English proficiency, to determine eligibility for working on the project. Subsequently, they participated in an introductory training course on the task which ended with a test that involved annotating 35 sample responses. This process ensured not only a thorough understanding of the task requirements but also the delivery of high-quality annotations.
3. Post-annotation, Scale AI performed extensive quality assurance, with each annotation reaching a minimum of two human reviews in addition to automated checks. After receiving the annotations from Scale AI, we conducted our independent quality assurance to make sure that the quality of the annotations was up to our expectations. As a result, some annotations were filtered away to retain only 37,120 samples.
## Ethical statement
Annotators for the dataset were contracted through Scale AI. Scale AI engages the Anker Methodology, GISC Impact Sourcing Standard, and UN Sustainable Development Goals to provide a fair and competitive pay. The specific pay is calculated based on many factors, including the specific project, the specialized skillset and expertise required, regional costs of living and then transparently listed on Scale AI platform. Scale AI also provides multiple channels for questions and support, including 24/7 support teams, community discussion channels with specially trained moderators, and a “speak up” hotline where contractors can report concerns anonymously. Worker concerns can be submitted to and are reviewed by our Remotasks support team, and pay disputes are reviewed by support specialists trained in this area.
## Contact
E-Mail: [Zhilin Wang](mailto:zhilinw@nvidia.com)
## Citation
If you find this dataset useful, please cite the following works
```bibtex
@misc{wang2023helpsteer,
title={HelpSteer: Multi-attribute Helpfulness Dataset for SteerLM},
author={Zhilin Wang and Yi Dong and Jiaqi Zeng and Virginia Adams and Makesh Narsimhan Sreedhar and Daniel Egert and Olivier Delalleau and Jane Polak Scowcroft and Neel Kant and Aidan Swope and Oleksii Kuchaiev},
year={2023},
eprint={2311.09528},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
```bibtex
@misc{dong2023steerlm,
title={SteerLM: Attribute Conditioned SFT as an (User-Steerable) Alternative to RLHF},
author={Yi Dong and Zhilin Wang and Makesh Narsimhan Sreedhar and Xianchao Wu and Oleksii Kuchaiev},
year={2023},
eprint={2310.05344},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
Ammar-Azman/crawl-mufti-negeri-sembilan | ---
license: mit
language:
- ms
tags:
- fatwa
- mufti
- malaysia
---
# Details
- Source: https://muftins.gov.my/
- Scrape date: 26/08/2023 |
rocioadlc/gary | ---
license: apache-2.0
---
|
sikaponcio/aa-dataset | ---
license: afl-3.0
---
|
ihaflix1/pauloloko | ---
license: openrail
---
|
theBrokenCat/SprintDataset-0.2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2642977867.0
num_examples: 499
download_size: 2613775457
dataset_size: 2642977867.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
masakhaner | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- am
- ha
- ig
- lg
- luo
- pcm
- rw
- sw
- wo
- yo
license:
- unknown
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: MasakhaNER
dataset_info:
- config_name: amh
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-DATE
'8': I-DATE
splits:
- name: train
num_bytes: 639911
num_examples: 1750
- name: validation
num_bytes: 92753
num_examples: 250
- name: test
num_bytes: 184271
num_examples: 500
download_size: 571951
dataset_size: 916935
- config_name: hau
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-DATE
'8': I-DATE
splits:
- name: train
num_bytes: 929848
num_examples: 1912
- name: validation
num_bytes: 139503
num_examples: 276
- name: test
num_bytes: 282971
num_examples: 552
download_size: 633372
dataset_size: 1352322
- config_name: ibo
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-DATE
'8': I-DATE
splits:
- name: train
num_bytes: 749196
num_examples: 2235
- name: validation
num_bytes: 110572
num_examples: 320
- name: test
num_bytes: 222192
num_examples: 638
download_size: 515415
dataset_size: 1081960
- config_name: kin
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-DATE
'8': I-DATE
splits:
- name: train
num_bytes: 878746
num_examples: 2116
- name: validation
num_bytes: 120998
num_examples: 302
- name: test
num_bytes: 258638
num_examples: 605
download_size: 633024
dataset_size: 1258382
- config_name: lug
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-DATE
'8': I-DATE
splits:
- name: train
num_bytes: 611917
num_examples: 1428
- name: validation
num_bytes: 70058
num_examples: 200
- name: test
num_bytes: 183063
num_examples: 407
download_size: 445755
dataset_size: 865038
- config_name: luo
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-DATE
'8': I-DATE
splits:
- name: train
num_bytes: 314995
num_examples: 644
- name: validation
num_bytes: 43506
num_examples: 92
- name: test
num_bytes: 87716
num_examples: 186
download_size: 213281
dataset_size: 446217
- config_name: pcm
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-DATE
'8': I-DATE
splits:
- name: train
num_bytes: 868229
num_examples: 2124
- name: validation
num_bytes: 126829
num_examples: 306
- name: test
num_bytes: 262185
num_examples: 600
download_size: 572054
dataset_size: 1257243
- config_name: swa
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-DATE
'8': I-DATE
splits:
- name: train
num_bytes: 1001120
num_examples: 2109
- name: validation
num_bytes: 128563
num_examples: 300
- name: test
num_bytes: 272108
num_examples: 604
download_size: 686313
dataset_size: 1401791
- config_name: wol
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-DATE
'8': I-DATE
splits:
- name: train
num_bytes: 602076
num_examples: 1871
- name: validation
num_bytes: 71535
num_examples: 267
- name: test
num_bytes: 191484
num_examples: 539
download_size: 364463
dataset_size: 865095
- config_name: yor
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-DATE
'8': I-DATE
splits:
- name: train
num_bytes: 1016741
num_examples: 2171
- name: validation
num_bytes: 127415
num_examples: 305
- name: test
num_bytes: 359519
num_examples: 645
download_size: 751510
dataset_size: 1503675
config_names:
- am
- ha
- ig
- lg
- luo
- pcm
- rw
- sw
- wo
- yo
---
# Dataset Card for MasakhaNER
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [homepage](https://github.com/masakhane-io/masakhane-ner)
- **Repository:** [github](https://github.com/masakhane-io/masakhane-ner)
- **Paper:** [paper](https://arxiv.org/abs/2103.11811)
- **Point of Contact:** [Masakhane](https://www.masakhane.io/) or didelani@lsv.uni-saarland.de
### Dataset Summary
MasakhaNER is the first large publicly available high-quality dataset for named entity recognition (NER) in ten African languages.
Named entities are phrases that contain the names of persons, organizations, locations, times and quantities. Example:
[PER Wolff] , currently a journalist in [LOC Argentina] , played with [PER Del Bosque] in the final years of the seventies in [ORG Real Madrid] .
MasakhaNER is a named entity dataset consisting of PER, ORG, LOC, and DATE entities annotated by Masakhane for ten African languages:
- Amharic
- Hausa
- Igbo
- Kinyarwanda
- Luganda
- Luo
- Nigerian-Pidgin
- Swahili
- Wolof
- Yoruba
The train/validation/test sets are available for all ten languages.
For more details see https://arxiv.org/abs/2103.11811
### Supported Tasks and Leaderboards
- `named-entity-recognition`: The performance in this task is measured with [F1](https://huggingface.co/metrics/f1) (higher is better). A named entity is correct only if it is an exact match of the corresponding entity in the data.
### Languages
There are ten languages available:
- Amharic (amh)
- Hausa (hau)
- Igbo (ibo)
- Kinyarwanda (kin)
- Luganda (lug)
- Luo (luo)
- Nigerian-Pidgin (pcm)
- Swahili (swa)
- Wolof (wol)
- Yoruba (yor)
## Dataset Structure
### Data Instances
The examples look like this for Yorùbá:
```
from datasets import load_dataset
data = load_dataset('masakhaner', 'yor')
# Please specify the language code
# A data point consists of sentences separated by empty lines, with tab-separated tokens and tags.
{'id': '0',
 'ner_tags': ['B-DATE', 'I-DATE', 'O', 'O', 'O', 'O', 'O', 'B-PER', 'I-PER', 'I-PER', 'O', 'O', 'O', 'O'],
'tokens': ['Wákàtí', 'méje', 'ti', 'ré', 'kọjá', 'lọ', 'tí', 'Luis', 'Carlos', 'Díaz', 'ti', 'di', 'awati', '.']
}
```
### Data Fields
- `id`: id of the sample
- `tokens`: the tokens of the example text
- `ner_tags`: the NER tags of each token
The NER tags correspond to this list:
```
"O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC", "B-DATE", "I-DATE",
```
In the NER tags, a B denotes the first token of a phrase and an I any non-initial token. There are four types of phrases: person names (PER), organizations (ORG), locations (LOC) and dates & times (DATE).
It is assumed that named entities are non-recursive and non-overlapping. If a named entity is embedded in another named entity, usually only the top-level entity is marked.
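Since `ner_tags` is a sequence of `ClassLabel` values, `load_dataset` returns integer ids; `data['train'].features['ner_tags'].feature.int2str` maps them back to the tag names above. Once the tags are strings, entity spans can be recovered from the BIO scheme with a small helper (a sketch, not part of the dataset tooling):

```python
def extract_entities(tokens, tags):
    """Group BIO-tagged tokens into (entity_type, text) spans."""
    entities = []
    current_tokens, current_type = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A B- tag starts a new entity, closing any open one.
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_tokens, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_type == tag[2:]:
            # An I- tag of the same type continues the current entity.
            current_tokens.append(token)
        else:
            # An O tag (or a type mismatch) closes the current entity.
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_tokens, current_type = [], None
    if current_tokens:
        entities.append((current_type, " ".join(current_tokens)))
    return entities

tokens = ['Wákàtí', 'méje', 'ti', 'ré', 'kọjá', 'lọ', 'tí',
          'Luis', 'Carlos', 'Díaz', 'ti', 'di', 'awati', '.']
tags = ['B-DATE', 'I-DATE', 'O', 'O', 'O', 'O', 'O',
        'B-PER', 'I-PER', 'I-PER', 'O', 'O', 'O', 'O']
print(extract_entities(tokens, tags))
# [('DATE', 'Wákàtí méje'), ('PER', 'Luis Carlos Díaz')]
```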
### Data Splits
For all languages, there are three splits.
The original splits were named `train`, `dev` and `test` and they correspond to the `train`, `validation` and `test` splits.
The splits have the following sizes :
| Language | train | validation | test |
|-----------------|------:|-----------:|-----:|
| Amharic | 1750 | 250 | 500 |
| Hausa | 1903 | 272 | 545 |
| Igbo | 2233 | 319 | 638 |
| Kinyarwanda | 2110 | 301 | 604 |
| Luganda | 2003 | 200 | 401 |
| Luo | 644 | 92 | 185 |
| Nigerian-Pidgin | 2100 | 300 | 600 |
| Swahili | 2104 | 300 | 602 |
| Wolof           |  1871 |        267 |  539 |
| Yoruba | 2124 | 303 | 608 |
## Dataset Creation
### Curation Rationale
The dataset was created to provide NER resources for ten African languages that were under-served in natural language processing.
### Source Data
The source of the data is from the news domain, details can be found here https://arxiv.org/abs/2103.11811
#### Initial Data Collection and Normalization
The articles were word-tokenized; information on the exact pre-processing pipeline is unavailable.
#### Who are the source language producers?
The source language was produced by journalists and writers employed by the news agency and newspaper mentioned above.
### Annotations
#### Annotation process
Details can be found here https://arxiv.org/abs/2103.11811
#### Who are the annotators?
Annotators were recruited from [Masakhane](https://www.masakhane.io/).
### Personal and Sensitive Information
The data is sourced from newspapers and only contains mentions of public figures or individuals.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
Users should keep in mind that the dataset only contains news text, which might limit the applicability of the developed systems to other domains.
## Additional Information
### Dataset Curators
### Licensing Information
The licensing status of the data is CC 4.0 Non-Commercial
### Citation Information
```
@article{Adelani2021MasakhaNERNE,
title={MasakhaNER: Named Entity Recognition for African Languages},
author={D. Adelani and Jade Abbott and Graham Neubig and Daniel D'Souza and Julia Kreutzer and Constantine Lignos
and Chester Palen-Michel and Happy Buzaaba and Shruti Rijhwani and Sebastian Ruder and Stephen Mayhew and
Israel Abebe Azime and S. Muhammad and Chris C. Emezue and Joyce Nakatumba-Nabende and Perez Ogayo and
Anuoluwapo Aremu and Catherine Gitau and Derguene Mbaye and J. Alabi and Seid Muhie Yimam and Tajuddeen R. Gwadabe and
Ignatius Ezeani and Rubungo Andre Niyongabo and Jonathan Mukiibi and V. Otiende and Iroro Orife and Davis David and
Samba Ngom and Tosin P. Adewumi and Paul Rayson and Mofetoluwa Adeyemi and Gerald Muriuki and Emmanuel Anebi and
C. Chukwuneke and N. Odu and Eric Peter Wairagala and S. Oyerinde and Clemencia Siro and Tobius Saul Bateesa and
Temilola Oloyede and Yvonne Wambui and Victor Akinode and Deborah Nabagereka and Maurice Katusiime and
Ayodele Awokoya and Mouhamadane Mboup and D. Gebreyohannes and Henok Tilaye and Kelechi Nwaike and Degaga Wolde and
Abdoulaye Faye and Blessing Sibanda and Orevaoghene Ahia and Bonaventure F. P. Dossou and Kelechi Ogueji and
Thierno Ibrahima Diop and A. Diallo and Adewale Akinfaderin and T. Marengereke and Salomey Osei},
journal={ArXiv},
year={2021},
volume={abs/2103.11811}
}
```
### Contributions
Thanks to [@dadelani](https://github.com/dadelani) for adding this dataset. |
kartikagg98/HINMIX_bn-en | ---
dataset_info:
- config_name: lcsalign-bn
features:
- name: text
dtype: string
splits:
- name: valid
num_bytes: 184113
num_examples: 1001
- name: test
num_bytes: 183116
num_examples: 1001
download_size: 173769
dataset_size: 367229
- config_name: lcsalign-bnrom
features:
- name: text
dtype: string
splits:
- name: valid
num_bytes: 80652
num_examples: 1001
- name: test
num_bytes: 79698
num_examples: 1001
download_size: 118502
dataset_size: 160350
- config_name: lcsalign-en
features:
- name: text
dtype: string
splits:
- name: valid
num_bytes: 76019
num_examples: 1001
- name: test
num_bytes: 75359
num_examples: 1001
download_size: 113901
dataset_size: 151378
configs:
- config_name: lcsalign-bn
data_files:
- split: valid
path: lcsalign-bn/valid-*
- split: test
path: lcsalign-bn/test-*
- config_name: lcsalign-bnrom
data_files:
- split: valid
path: lcsalign-bnrom/valid-*
- split: test
path: lcsalign-bnrom/test-*
- config_name: lcsalign-en
data_files:
- split: valid
path: lcsalign-en/valid-*
- split: test
path: lcsalign-en/test-*
---
|
adamjweintraut/bart-finetuned-lyrlen-512_2024-03-19_run | ---
dataset_info:
features:
- name: id
dtype: int64
- name: syllable_counts
dtype: string
- name: predicted
dtype: string
- name: label
dtype: string
- name: rougeL_min_precision
dtype: float64
- name: rougeL_min_recall
dtype: float64
- name: rougeL_min_fmeasure
dtype: float64
- name: rougeL_median_precision
dtype: float64
- name: rougeL_median_recall
dtype: float64
- name: rougeL_median_fmeasure
dtype: float64
- name: rougeL_max_precision
dtype: float64
- name: rougeL_max_recall
dtype: float64
- name: rougeL_max_fmeasure
dtype: float64
- name: predicted_label_sim
dtype: float32
splits:
- name: train
num_bytes: 61731
num_examples: 20
download_size: 58261
dataset_size: 61731
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_80_1713222809 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 181394
num_examples: 423
download_size: 94038
dataset_size: 181394
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TallalUsman/office-llm | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 1876884.0
num_examples: 229
- name: test
num_bytes: 213096.0
num_examples: 26
download_size: 991661
dataset_size: 2089980.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "office-llm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qazisaad/llama_2_product_titles-esci_train-temp | ---
dataset_info:
features:
- name: index
dtype: int64
- name: query
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: preds
dtype: string
splits:
- name: train
num_bytes: 2985474
num_examples: 1564
download_size: 325190
dataset_size: 2985474
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2_product_titles-esci_train-temp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_220 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 21843332208.375
num_examples: 227421
download_size: 19732408435
dataset_size: 21843332208.375
---
# Dataset Card for "chunk_220"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jasshl/bedroom | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 19707144.0
num_examples: 315
download_size: 18372623
dataset_size: 19707144.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
semeru/code-code-DefectDetection | ---
license: mit
Programminglanguage: "C"
version: "N/A"
Date: "Devign(Jun 2019 - paper release date)"
Contaminated: "Very Likely"
Size: "Standard Tokenizer"
---
### Dataset is imported from CodeXGLUE and pre-processed using their script.
# Where to find in Semeru:
The dataset can be found at /nfs/semeru/semeru_datasets/code_xglue/code-to-code/Defect-detection in Semeru
# CodeXGLUE -- Defect Detection
## Task Definition
Given a source code, the task is to identify whether it is an insecure code that may attack software systems, such as resource leaks, use-after-free vulnerabilities and DoS attack. We treat the task as binary classification (0/1), where 1 stands for insecure code and 0 for secure code.
### Dataset
The dataset we use comes from the paper [*Devign*: Effective Vulnerability Identification by Learning Comprehensive Program Semantics via Graph Neural Networks](http://papers.nips.cc/paper/9209-devign-effective-vulnerability-identification-by-learning-comprehensive-program-semantics-via-graph-neural-networks.pdf). We combine all projects and split 80%/10%/10% for training/dev/test.
### Data Format
Three pre-processed `.jsonl` files are provided: `train.jsonl`, `valid.jsonl` and `test.jsonl`.
Each line in a file represents one function, with the following fields:
- **func:** the source code
- **target:** 0 or 1 (vulnerability or not)
- **idx:** the index of the example
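A minimal sketch of reading one of the files, using the field names listed above (the sample line below is illustrative, not taken from the dataset):

```python
import json

def load_examples(path):
    """Yield one dict per line: 'func' (source code), 'target' (0/1), 'idx'."""
    with open(path) as f:
        for line in f:
            yield json.loads(line)

# Illustrative line in the same shape as the dataset records:
sample = '{"func": "int main() { return 0; }", "target": 0, "idx": 42}'
record = json.loads(sample)
print(record["target"], record["idx"])  # 0 42
```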
### Data Statistics
Data statistics of the dataset are shown in the below table:
| | #Examples |
| ----- | :-------: |
| Train | 21,854 |
| Dev | 2,732 |
| Test | 2,732 |
## Reference
<pre><code>@inproceedings{zhou2019devign,
title={Devign: Effective vulnerability identification by learning comprehensive program semantics via graph neural networks},
author={Zhou, Yaqin and Liu, Shangqing and Siow, Jingkai and Du, Xiaoning and Liu, Yang},
booktitle={Advances in Neural Information Processing Systems},
pages={10197--10207},
year={2019}
}</code></pre>
|
thevox/en-nb-15k | ---
license: mpl-2.0
task_categories:
- translation
language:
- en
- nb
- 'no'
pretty_name: English-Norwegian Translation
size_categories:
- 10K<n<100K
---
## Methodology
GPT-3.5 was used with a translation prompt to produce two Norwegian translations of each English text: a standard version and a "more natural" version. A short context is also generated for each sample.
## Data
Input sample format:
```
Oversett til Norsk:
{text}
```
Each output sample is formatted like this:
```
Kontekst: {a, b, c}
Oversettelse:
{translation}
Mer naturlig:
{improved_translation}
```
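As a sketch, an (input, target) pair in this format can be assembled as follows (the helper name is ours, not part of the dataset):

```python
def format_pair(text, context, translation, improved):
    """Build the prompt and completion strings in the dataset's format (hypothetical helper)."""
    prompt = f"Oversett til Norsk:\n{text}"
    completion = (
        f"Kontekst: {{{', '.join(context)}}}\n"
        f"Oversettelse:\n{translation}\n"
        f"Mer naturlig:\n{improved}"
    )
    return prompt, completion

prompt, completion = format_pair(
    "Good morning!", ["greeting", "informal"], "God morgen!", "God morgen!"
)
print(prompt)
print(completion)
```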
## Future work
This dataset will be used to train LLM-based translation models built on Llama 2 and similar models, aiming to rival DeepL and ChatGPT machine translation.
## Dataset used
Original english text from: [nampdn-ai/tiny-textbooks](https://huggingface.co/datasets/nampdn-ai/tiny-textbooks)
### Author Contact
jonaslsa@uio.no |
stanmalkinson199/CartmanClassico | ---
license: openrail
---
|
RikoteMaster/isear_for_llama2_v3 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: Text_processed
dtype: string
- name: Emotion
dtype: string
- name: Augmented
dtype: bool
- name: text
dtype: string
splits:
- name: train
num_bytes: 4880496
num_examples: 8823
- name: test
num_bytes: 974743
num_examples: 1879
download_size: 2117583
dataset_size: 5855239
---
# Dataset Card for "isear_for_llama2_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
russellbal/dictionary-openchat-3.5-0106 | ---
license: wtfpl
---
To see how this dataset was created, watch the following videos:
Are words free?:
* https://youtu.be/Utg_D-yQB_E?si=FKp_QZ4PbKesiDrn
Replacing Chatgpt 3.5 turbo workflows with Openchat:
* https://youtu.be/DNKepnKuZns?si=bleufaiGdwGdrueK
|
BeIR/hotpotqa-qrels | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100k<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
- zero-shot-retrieval
- information-retrieval
- zero-shot-information-retrieval
task_ids:
- passage-retrieval
- entity-linking-retrieval
- fact-checking-retrieval
- tweet-retrieval
- citation-prediction-retrieval
- duplication-question-retrieval
- argument-retrieval
- news-retrieval
- biomedical-information-retrieval
- question-answering-retrieval
---
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](tp://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
### Supported Tasks and Leaderboards
The dataset supports a leaderboard that evaluates models against task-specific metrics such as F1 or EM, as well as their ability to retrieve supporting information from Wikipedia.
The current best performing models can be found [here](https://eval.ai/web/challenges/challenge-page/689/leaderboard/).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score` in this order. Keep the first row as a header. For example: `q1 doc1 1`
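As a sketch, a qrels file in this layout can be parsed into the nested-dictionary shape used throughout BEIR (the sample TSV below is illustrative):

```python
import csv
import io

def read_qrels(tsv_text):
    """Parse a BEIR qrels TSV (header: query-id, corpus-id, score) into {qid: {docid: score}}."""
    qrels = {}
    reader = csv.reader(io.StringIO(tsv_text), delimiter="\t")
    next(reader)  # skip the header row
    for query_id, corpus_id, score in reader:
        qrels.setdefault(query_id, {})[corpus_id] = int(score)
    return qrels

sample = "query-id\tcorpus-id\tscore\nq1\tdoc1\t1\nq2\tdoc2\t1\n"
print(read_qrels(sample))  # {'q1': {'doc1': 1}, 'q2': {'doc2': 1}}
```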
### Data Instances
A high level example of any beir dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist. who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query document relevance judgements, made up of:
- `_id`: a `string` feature representing the query id
- `_id`: a `string` feature, denoting the document id.
    - `score`: an `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
omerm/test_dataset | ---
license: apache-2.0
---
|
SKyu/2019_2020data | ---
license: cc-by-nc-sa-4.0
language:
- en
- ko
pretty_name: architecture data set 2019-2020
size_categories:
- 10K<n<100K
--- |
JamesStratford/voice-of-birds | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': Andean Guan
'1': Andean Tinamou
'2': Australian Brushturkey
'3': Band-tailed Guan
'4': Barred Tinamou
'5': Bartletts Tinamou
'6': Baudo Guan
'7': Bearded Guan
'8': Berlepschs Tinamou
'9': Biak Scrubfowl
'10': Black Tinamou
'11': Black-billed Brushturkey
'12': Black-capped Tinamou
'13': Black-fronted Piping Guan
'14': Blue-throated Piping Guan
'15': Brazilian Tinamou
'16': Brown Tinamou
'17': Brushland Tinamou
'18': Buff-browed Chachalaca
'19': Cauca Guan
'20': Chaco Chachalaca
'21': Chestnut-bellied Guan
'22': Chestnut-headed Chachalaca
'23': Chestnut-winged Chachalaca
'24': Chilean Tinamou
'25': Choco Tinamou
'26': Cinereous Tinamou
'27': Collared Brushturkey
'28': Colombian Chachalaca
'29': Common Ostrich
'30': Crested Guan
'31': Curve-billed Tinamou
'32': Darwins Nothura
'33': Dusky Megapode
'34': Dusky-legged Guan
'35': Dwarf Cassowary
'36': Dwarf Tinamou
'37': East Brazilian Chachalaca
'38': Elegant Crested Tinamou
'39': Emu
'40': Great Spotted Kiwi
'41': Great Tinamou
'42': Greater Rhea
'43': Grey Tinamou
'44': Grey-headed Chachalaca
'45': Grey-legged Tinamou
'46': Highland Tinamou
'47': Hooded Tinamou
'48': Huayco Tinamou
'49': Lesser Nothura
'50': Lesser Rhea
'51': Little Chachalaca
'52': Little Spotted Kiwi
'53': Little Tinamou
'54': Maleo
'55': Malleefowl
'56': Marail Guan
'57': Melanesian Megapode
'58': Micronesian Megapode
'59': Moluccan Megapode
'60': New Guinea Scrubfowl
'61': Nicobar Megapode
'62': North Island Brown Kiwi
'63': Northern Cassowary
'64': Okarito Kiwi
'65': Orange-footed Scrubfowl
'66': Ornate Tinamou
'67': Pale-browed Tinamou
'68': Patagonian Tinamou
'69': Philippine Megapode
'70': Plain Chachalaca
'71': Puna Tinamou
'72': Quebracho Crested Tinamou
'73': Red-billed Brushturkey
'74': Red-faced Guan
'75': Red-legged Tinamou
'76': Red-throated Piping Guan
'77': Red-winged Tinamou
'78': Rufous-bellied Chachalaca
'79': Rufous-headed Chachalaca
'80': Rufous-vented Chachalaca
'81': Rusty Tinamou
'82': Rusty-margined Guan
'83': Scaled Chachalaca
'84': Slaty-breasted Tinamou
'85': Small-billed Tinamou
'86': Solitary Tinamou
'87': Somali Ostrich
'88': Southern Brown Kiwi
'89': Southern Cassowary
'90': Speckled Chachalaca
'91': Spixs Guan
'92': Spotted Nothura
'93': Sula Megapode
'94': Taczanowskis Tinamou
'95': Tanimbar Megapode
'96': Tataupa Tinamou
'97': Tawny-breasted Tinamou
'98': Tepui Tinamou
'99': Thicket Tinamou
'100': Tongan Megapode
'101': Trinidad Piping Guan
'102': Undulated Tinamou
'103': Vanuatu Megapode
'104': Variegated Tinamou
'105': Wattled Brushturkey
'106': West Mexican Chachalaca
'107': White-bellied Chachalaca
'108': White-bellied Nothura
'109': White-browed Guan
'110': White-crested Guan
'111': White-throated Tinamou
'112': White-winged Guan
'113': Yellow-legged Tinamou
splits:
- name: train
num_bytes: 4195346823.345
num_examples: 1723
- name: test
num_bytes: 1127981706.0
num_examples: 431
download_size: 3288384913
dataset_size: 5323328529.344999
---
# Dataset Card for "voice-of-birds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/sv_98_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of sv_98/SV-98/SV-98 (Girls' Frontline)
This is the dataset of sv_98/SV-98/SV-98 (Girls' Frontline), containing 138 images and their tags.
The core tags of this character are `blonde_hair, long_hair, red_eyes, ponytail, hairband, bow, hair_bow, green_hairband, breasts, green_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 138 | 213.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sv_98_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 138 | 102.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sv_98_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 346 | 229.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sv_98_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 138 | 178.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sv_98_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 346 | 354.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sv_98_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
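The IMG+TXT packages above ship each image alongside a same-named `.txt` file holding its comma-separated tags. A minimal sketch of pairing them after extracting one of the zips — the directory path, file extensions, and flat layout are assumptions, not guarantees of the archive format:

```python
import os

def pair_images_with_tags(dataset_dir):
    """Collect (image_path, tag_list) pairs from an extracted IMG+TXT package."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(txt_path):
            continue  # image without a matching tag file; skip it
        with open(txt_path, encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```

This only walks a local directory, so it works the same for the 800, 1200, and stage3 packages once they are unzipped.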
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/sv_98_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, solo, mod3_(girls'_frontline), white_jacket, black_gloves, white_scarf, artist_name, black_bodysuit, closed_mouth, smile, looking_at_viewer, white_background, multicolored_gloves, bolt_action, holding, sniper_rifle, black_thighhighs, simple_background, turtleneck, blush |
| 1 | 6 |  |  |  |  |  | 1girl, artist_name, blush, bangs, closed_mouth, looking_at_viewer, nipples, nude, smile, solo, mod3_(girls'_frontline), simple_background, standing, wedding_ring, white_background, medium_breasts, navel, tattoo |
| 2 | 9 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, penis, solo_focus, blush, completely_nude, pussy, sex, vaginal, artist_name, navel, pov, looking_at_viewer, large_breasts, open_mouth, bar_censor, cowgirl_position, girl_on_top, lying, mod3_(girls'_frontline), wedding_ring |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | mod3_(girls'_frontline) | white_jacket | black_gloves | white_scarf | artist_name | black_bodysuit | closed_mouth | smile | looking_at_viewer | white_background | multicolored_gloves | bolt_action | holding | sniper_rifle | black_thighhighs | simple_background | turtleneck | blush | bangs | nipples | nude | standing | wedding_ring | medium_breasts | navel | tattoo | 1boy | hetero | penis | solo_focus | completely_nude | pussy | sex | vaginal | pov | large_breasts | open_mouth | bar_censor | cowgirl_position | girl_on_top | lying |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------------|:---------------|:---------------|:--------------|:--------------|:-----------------|:---------------|:--------|:--------------------|:-------------------|:----------------------|:--------------|:----------|:---------------|:-------------------|:--------------------|:-------------|:--------|:--------|:----------|:-------|:-----------|:---------------|:-----------------|:--------|:---------|:-------|:---------|:--------|:-------------|:------------------|:--------|:------|:----------|:------|:----------------|:-------------|:-------------|:-------------------|:--------------|:--------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | | | X | | X | X | X | X | | | | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | X | | | | X | | | | X | | | | | | | | | X | | X | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
ID3/comentarios_youtube_validos_timestamp | ---
dataset_info:
features:
- name: comentario
dtype: string
- name: likes
dtype: int64
- name: similitud
dtype: float64
- name: timeStamp
dtype: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 1347724
num_examples: 8181
download_size: 0
dataset_size: 1347724
---
# Dataset Card for "comentarios_youtube_validos_timestamp"
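The schema above pairs each YouTube comment (`comentario`) with a like count, a similarity score (`similitud`), and a video timestamp. A minimal sketch of filtering and ranking rows once loaded — the threshold and sample rows are illustrative assumptions:

```python
def top_comments(rows, min_sim=0.5, k=3):
    """Keep rows whose `similitud` clears the threshold, ranked by `likes`."""
    kept = [r for r in rows if r['similitud'] >= min_sim]
    return sorted(kept, key=lambda r: r['likes'], reverse=True)[:k]

rows = [
    {'comentario': 'great explanation', 'likes': 12, 'similitud': 0.9, 'timeStamp': 30, 'id': 'a'},
    {'comentario': 'off topic',         'likes': 50, 'similitud': 0.1, 'timeStamp': 95, 'id': 'b'},
    {'comentario': 'nice point',        'likes': 4,  'similitud': 0.7, 'timeStamp': 12, 'id': 'c'},
]
print([r['id'] for r in top_comments(rows)])  # ['a', 'c']
```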
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
soddokayo/crime-1 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 12070
num_examples: 180
download_size: 6185
dataset_size: 12070
---
# Dataset Card for "crime-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kimnt93/zh-sharegpt | ---
dataset_info:
features:
- name: role
dtype: string
- name: value
dtype: string
- name: seq
dtype: int64
- name: conv
dtype: string
splits:
- name: train
num_bytes: 85394222
num_examples: 163760
download_size: 43860934
dataset_size: 85394222
---
# Dataset Card for "zh-sharegpt"
[zh ShareGPT](https://huggingface.co/datasets/philschmid/sharegpt-raw) |
open-llm-leaderboard/details_speechlessai__speechless-codellama-dolphin-orca-platypus-13b | ---
pretty_name: Evaluation run of speechlessai/speechless-codellama-dolphin-orca-platypus-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [speechlessai/speechless-codellama-dolphin-orca-platypus-13b](https://huggingface.co/speechlessai/speechless-codellama-dolphin-orca-platypus-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_speechlessai__speechless-codellama-dolphin-orca-platypus-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T03:55:53.446205](https://huggingface.co/datasets/open-llm-leaderboard/details_speechlessai__speechless-codellama-dolphin-orca-platypus-13b/blob/main/results_2023-10-24T03-55-53.446205.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.28366191275167785,\n\
\ \"em_stderr\": 0.004616354866148243,\n \"f1\": 0.3476940016778539,\n\
\ \"f1_stderr\": 0.004573065876077725,\n \"acc\": 0.36921252317982634,\n\
\ \"acc_stderr\": 0.010525993118068228\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.28366191275167785,\n \"em_stderr\": 0.004616354866148243,\n\
\ \"f1\": 0.3476940016778539,\n \"f1_stderr\": 0.004573065876077725\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08491281273692192,\n \
\ \"acc_stderr\": 0.0076782128244508\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.6535122336227308,\n \"acc_stderr\": 0.013373773411685655\n\
\ }\n}\n```"
repo_url: https://huggingface.co/speechlessai/speechless-codellama-dolphin-orca-platypus-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|arc:challenge|25_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T03_55_53.446205
path:
- '**/details_harness|drop|3_2023-10-24T03-55-53.446205.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T03-55-53.446205.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T03_55_53.446205
path:
- '**/details_harness|gsm8k|5_2023-10-24T03-55-53.446205.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T03-55-53.446205.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hellaswag|10_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-05-20.709991.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T12-05-20.709991.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T12-05-20.709991.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T03_55_53.446205
path:
- '**/details_harness|winogrande|5_2023-10-24T03-55-53.446205.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T03-55-53.446205.parquet'
- config_name: results
data_files:
- split: 2023_09_13T12_05_20.709991
path:
- results_2023-09-13T12-05-20.709991.parquet
- split: 2023_10_24T03_55_53.446205
path:
- results_2023-10-24T03-55-53.446205.parquet
- split: latest
path:
- results_2023-10-24T03-55-53.446205.parquet
---
# Dataset Card for Evaluation run of speechlessai/speechless-codellama-dolphin-orca-platypus-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/speechlessai/speechless-codellama-dolphin-orca-platypus-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [speechlessai/speechless-codellama-dolphin-orca-platypus-13b](https://huggingface.co/speechlessai/speechless-codellama-dolphin-orca-platypus-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_speechlessai__speechless-codellama-dolphin-orca-platypus-13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T03:55:53.446205](https://huggingface.co/datasets/open-llm-leaderboard/details_speechlessai__speechless-codellama-dolphin-orca-platypus-13b/blob/main/results_2023-10-24T03-55-53.446205.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.28366191275167785,
"em_stderr": 0.004616354866148243,
"f1": 0.3476940016778539,
"f1_stderr": 0.004573065876077725,
"acc": 0.36921252317982634,
"acc_stderr": 0.010525993118068228
},
"harness|drop|3": {
"em": 0.28366191275167785,
"em_stderr": 0.004616354866148243,
"f1": 0.3476940016778539,
"f1_stderr": 0.004573065876077725
},
"harness|gsm8k|5": {
"acc": 0.08491281273692192,
"acc_stderr": 0.0076782128244508
},
"harness|winogrande|5": {
"acc": 0.6535122336227308,
"acc_stderr": 0.013373773411685655
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
alvations/c4p0-fr-en | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: target_backto_source
dtype: string
- name: raw_target
list:
- name: generated_text
dtype: string
- name: raw_target_backto_source
list:
- name: generated_text
dtype: string
- name: prompt
dtype: string
- name: reverse_prompt
dtype: string
- name: source_langid
dtype: string
- name: target_langid
dtype: string
- name: target_backto_source_langid
dtype: string
- name: doc_id
dtype: int64
- name: sent_id
dtype: int64
- name: timestamp
dtype: timestamp[us]
- name: url
dtype: string
- name: doc_hash
dtype: string
splits:
- name: train
num_bytes: 4749
num_examples: 3
download_size: 19021
dataset_size: 4749
configs:
- config_name: default
data_files:
- split: train
path: f52863d73b78efef/train-*
---
|
JohnnyWilardson/SCL | ---
license: afl-3.0
---
|
Databasesprojec/FinStmts_ConsUncons_French_Predict_part2 | ---
dataset_info:
features:
- name: label
dtype: int64
- name: id
dtype: string
- name: text
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 4096082127
num_examples: 9769
download_size: 1904994350
dataset_size: 4096082127
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chrominancedesign/hf | ---
license: apache-2.0
---
|
polejowska/cd45rb_test | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
list:
- name: category_id
dtype:
class_label:
names:
'0': leukocyte
- name: image_id
dtype: string
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: segmentation
list:
list: float32
- name: iscrowd
dtype: bool
splits:
- name: test
num_bytes: 4074586864.944
num_examples: 2116
download_size: 4077802300
dataset_size: 4074586864.944
---
# Dataset Card for "cd45rb_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SocialGrep/the-antiwork-subreddit-dataset | ---
annotations_creators:
- lexyr
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- original
paperswithcode_id: null
---
# Dataset Card for the-antiwork-subreddit-dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://socialgrep.com/datasets](https://socialgrep.com/datasets/the-antiwork-subreddit-dataset?utm_source=huggingface&utm_medium=link&utm_campaign=theantiworksubredditdataset)
- **Point of Contact:** [Website](https://socialgrep.com/contact?utm_source=huggingface&utm_medium=link&utm_campaign=theantiworksubredditdataset)
### Dataset Summary
This corpus contains the complete data for the activity of the /r/Antiwork subreddit until 2022-02-18.
### Languages
Mainly English.
## Dataset Structure
### Data Instances
A data point is a post or a comment. Due to the separate nature of the two, those exist in two different files - even though many fields are shared.
### Data Fields
- 'type': the type of the data point. Can be 'post' or 'comment'.
- 'id': the base-36 Reddit ID of the data point. Unique when combined with type.
- 'subreddit.id': the base-36 Reddit ID of the data point's host subreddit. Unique.
- 'subreddit.name': the human-readable name of the data point's host subreddit.
- 'subreddit.nsfw': a boolean marking the data point's host subreddit as NSFW or not.
- 'created_utc': a UTC timestamp for the data point.
- 'permalink': a reference link to the data point on Reddit.
- 'score': score of the data point on Reddit.
- 'domain': (Post only) the domain of the data point's link.
- 'url': (Post only) the destination of the data point's link, if any.
- 'selftext': (Post only) the self-text of the data point, if any.
- 'title': (Post only) the title of the post data point.
- 'body': (Comment only) the body of the comment data point.
- 'sentiment': (Comment only) the result of an in-house sentiment analysis pipeline. Used for exploratory analysis.
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
CC-BY v4.0
### Contributions
[Needs More Information] |
WStark/dataset | ---
license: mit
---
|
LambdaTests/VQAv2_sample_validation_benchmarks_partition_global_14_loca_6 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 15
num_examples: 1
download_size: 0
dataset_size: 15
---
# Dataset Card for "VQAv2_sample_validation_benchmarks_partition_global_14_loca_6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Weni/Dataset_semantic_alignment_translation_en-es-direction_en-pt_br-direction | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: string
dtype: string
- name: string_translation
dtype: string
splits:
- name: train
num_bytes: 9521461
num_examples: 40001
download_size: 3814409
dataset_size: 9521461
---
# Dataset Card for "Dataset_semantic_alignment_translation_en-es-direction_en-pt_br-direction"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
saibo/bookcorpus_compact_1024_test_meta | ---
dataset_info:
features:
- name: text
dtype: string
- name: concept_with_offset
dtype: string
- name: cid_arrangement
sequence: int32
- name: schema_lengths
sequence: int64
- name: topic_entity_mask
sequence: int64
- name: text_lengths
sequence: int64
splits:
- name: train
num_bytes: 758527093
num_examples: 6160
download_size: 169143634
dataset_size: 758527093
---
# Dataset Card for "bookcorpus_compact_1024_test_meta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fathyshalab/reklamation24_oeffentlichkeit-soziales-intent | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 84032
num_examples: 153
- name: test
num_bytes: 19855
num_examples: 39
download_size: 62925
dataset_size: 103887
---
# Dataset Card for "reklamation24_oeffentlichkeit-soziales-intent"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.12 | ---
pretty_name: Evaluation run of SF-Foundation/Ein-72B-v0.12
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SF-Foundation/Ein-72B-v0.12](https://huggingface.co/SF-Foundation/Ein-72B-v0.12)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.12\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-13T11:06:19.237402](https://huggingface.co/datasets/open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.12/blob/main/results_2024-02-13T11-06-19.237402.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7720004576068558,\n\
\ \"acc_stderr\": 0.028018920061937066,\n \"acc_norm\": 0.77366212968727,\n\
\ \"acc_norm_stderr\": 0.028576972189266775,\n \"mc1\": 0.6597307221542228,\n\
\ \"mc1_stderr\": 0.016586304901762553,\n \"mc2\": 0.7778465654225306,\n\
\ \"mc2_stderr\": 0.013819882710780051\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7406143344709898,\n \"acc_stderr\": 0.01280827357392709,\n\
\ \"acc_norm\": 0.7619453924914675,\n \"acc_norm_stderr\": 0.0124457700280262\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7251543517227644,\n\
\ \"acc_stderr\": 0.004455240755811573,\n \"acc_norm\": 0.8946425014937264,\n\
\ \"acc_norm_stderr\": 0.003063860621772738\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n\
\ \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n\
\ \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474928,\n\
\ \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474928\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n\
\ \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8377358490566038,\n \"acc_stderr\": 0.02269148287203535,\n\
\ \"acc_norm\": 0.8377358490566038,\n \"acc_norm_stderr\": 0.02269148287203535\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9375,\n\
\ \"acc_stderr\": 0.02024219611347799,\n \"acc_norm\": 0.9375,\n \
\ \"acc_norm_stderr\": 0.02024219611347799\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349417,\n\
\ \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349417\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7793103448275862,\n \"acc_stderr\": 0.03455930201924811,\n\
\ \"acc_norm\": 0.7793103448275862,\n \"acc_norm_stderr\": 0.03455930201924811\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6851851851851852,\n \"acc_stderr\": 0.023919984164047732,\n \"\
acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.023919984164047732\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8870967741935484,\n \"acc_stderr\": 0.0180036033258636,\n \"acc_norm\"\
: 0.8870967741935484,\n \"acc_norm_stderr\": 0.0180036033258636\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6650246305418719,\n\
\ \"acc_stderr\": 0.033208527423483104,\n \"acc_norm\": 0.6650246305418719,\n\
\ \"acc_norm_stderr\": 0.033208527423483104\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \
\ \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n\
\ \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9393939393939394,\n \"acc_stderr\": 0.016999994927421592,\n \"\
acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.016999994927421592\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084315,\n\
\ \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084315\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8102564102564103,\n \"acc_stderr\": 0.01988016540658877,\n \
\ \"acc_norm\": 0.8102564102564103,\n \"acc_norm_stderr\": 0.01988016540658877\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.45555555555555555,\n \"acc_stderr\": 0.03036486250482443,\n \
\ \"acc_norm\": 0.45555555555555555,\n \"acc_norm_stderr\": 0.03036486250482443\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n\
\ \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5562913907284768,\n \"acc_stderr\": 0.04056527902281732,\n \"\
acc_norm\": 0.5562913907284768,\n \"acc_norm_stderr\": 0.04056527902281732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9357798165137615,\n \"acc_stderr\": 0.010510494713201403,\n \"\
acc_norm\": 0.9357798165137615,\n \"acc_norm_stderr\": 0.010510494713201403\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6898148148148148,\n \"acc_stderr\": 0.03154696285656627,\n \"\
acc_norm\": 0.6898148148148148,\n \"acc_norm_stderr\": 0.03154696285656627\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \
\ \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.02871877688934232,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.02871877688934232\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540616,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540616\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n\
\ \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\
\ \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n\
\ \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.0349260647662379,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.0349260647662379\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n\
\ \"acc_stderr\": 0.015006312806446914,\n \"acc_norm\": 0.9444444444444444,\n\
\ \"acc_norm_stderr\": 0.015006312806446914\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977725,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977725\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9195402298850575,\n\
\ \"acc_stderr\": 0.009726831316141866,\n \"acc_norm\": 0.9195402298850575,\n\
\ \"acc_norm_stderr\": 0.009726831316141866\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8410404624277457,\n \"acc_stderr\": 0.019685307033571946,\n\
\ \"acc_norm\": 0.8410404624277457,\n \"acc_norm_stderr\": 0.019685307033571946\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6960893854748603,\n\
\ \"acc_stderr\": 0.015382845587584517,\n \"acc_norm\": 0.6960893854748603,\n\
\ \"acc_norm_stderr\": 0.015382845587584517\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.02046417512433263,\n\
\ \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.02046417512433263\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8456591639871383,\n\
\ \"acc_stderr\": 0.02051905034208471,\n \"acc_norm\": 0.8456591639871383,\n\
\ \"acc_norm_stderr\": 0.02051905034208471\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8641975308641975,\n \"acc_stderr\": 0.019061588181505405,\n\
\ \"acc_norm\": 0.8641975308641975,\n \"acc_norm_stderr\": 0.019061588181505405\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6631205673758865,\n \"acc_stderr\": 0.02819553487396673,\n \
\ \"acc_norm\": 0.6631205673758865,\n \"acc_norm_stderr\": 0.02819553487396673\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6082138200782269,\n\
\ \"acc_stderr\": 0.012467564418145118,\n \"acc_norm\": 0.6082138200782269,\n\
\ \"acc_norm_stderr\": 0.012467564418145118\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.022368672562886747,\n\
\ \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.022368672562886747\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273337,\n \
\ \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273337\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8326530612244898,\n \"acc_stderr\": 0.02389714476891452,\n\
\ \"acc_norm\": 0.8326530612244898,\n \"acc_norm_stderr\": 0.02389714476891452\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9054726368159204,\n\
\ \"acc_stderr\": 0.020687186951534094,\n \"acc_norm\": 0.9054726368159204,\n\
\ \"acc_norm_stderr\": 0.020687186951534094\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276894,\n\
\ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276894\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6597307221542228,\n\
\ \"mc1_stderr\": 0.016586304901762553,\n \"mc2\": 0.7778465654225306,\n\
\ \"mc2_stderr\": 0.013819882710780051\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8445146014206788,\n \"acc_stderr\": 0.010184308214775778\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7922668688400303,\n \
\ \"acc_stderr\": 0.011174572716705886\n }\n}\n```"
repo_url: https://huggingface.co/SF-Foundation/Ein-72B-v0.12
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|arc:challenge|25_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|gsm8k|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hellaswag|10_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T11-06-19.237402.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T11-06-19.237402.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- '**/details_harness|winogrande|5_2024-02-13T11-06-19.237402.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-13T11-06-19.237402.parquet'
- config_name: results
data_files:
- split: 2024_02_13T11_06_19.237402
path:
- results_2024-02-13T11-06-19.237402.parquet
- split: latest
path:
- results_2024-02-13T11-06-19.237402.parquet
---
# Dataset Card for Evaluation run of SF-Foundation/Ein-72B-v0.12
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SF-Foundation/Ein-72B-v0.12](https://huggingface.co/SF-Foundation/Ein-72B-v0.12) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.12",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-13T11:06:19.237402](https://huggingface.co/datasets/open-llm-leaderboard/details_SF-Foundation__Ein-72B-v0.12/blob/main/results_2024-02-13T11-06-19.237402.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7720004576068558,
"acc_stderr": 0.028018920061937066,
"acc_norm": 0.77366212968727,
"acc_norm_stderr": 0.028576972189266775,
"mc1": 0.6597307221542228,
"mc1_stderr": 0.016586304901762553,
"mc2": 0.7778465654225306,
"mc2_stderr": 0.013819882710780051
},
"harness|arc:challenge|25": {
"acc": 0.7406143344709898,
"acc_stderr": 0.01280827357392709,
"acc_norm": 0.7619453924914675,
"acc_norm_stderr": 0.0124457700280262
},
"harness|hellaswag|10": {
"acc": 0.7251543517227644,
"acc_stderr": 0.004455240755811573,
"acc_norm": 0.8946425014937264,
"acc_norm_stderr": 0.003063860621772738
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474928,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8377358490566038,
"acc_stderr": 0.02269148287203535,
"acc_norm": 0.8377358490566038,
"acc_norm_stderr": 0.02269148287203535
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9375,
"acc_stderr": 0.02024219611347799,
"acc_norm": 0.9375,
"acc_norm_stderr": 0.02024219611347799
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.026355158413349417,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.026355158413349417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7793103448275862,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.7793103448275862,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8870967741935484,
"acc_stderr": 0.0180036033258636,
"acc_norm": 0.8870967741935484,
"acc_norm_stderr": 0.0180036033258636
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9393939393939394,
"acc_stderr": 0.016999994927421592,
"acc_norm": 0.9393939393939394,
"acc_norm_stderr": 0.016999994927421592
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9844559585492227,
"acc_stderr": 0.008927492715084315,
"acc_norm": 0.9844559585492227,
"acc_norm_stderr": 0.008927492715084315
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8102564102564103,
"acc_stderr": 0.01988016540658877,
"acc_norm": 0.8102564102564103,
"acc_norm_stderr": 0.01988016540658877
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45555555555555555,
"acc_stderr": 0.03036486250482443,
"acc_norm": 0.45555555555555555,
"acc_norm_stderr": 0.03036486250482443
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5562913907284768,
"acc_stderr": 0.04056527902281732,
"acc_norm": 0.5562913907284768,
"acc_norm_stderr": 0.04056527902281732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9357798165137615,
"acc_stderr": 0.010510494713201403,
"acc_norm": 0.9357798165137615,
"acc_norm_stderr": 0.010510494713201403
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6898148148148148,
"acc_stderr": 0.03154696285656627,
"acc_norm": 0.6898148148148148,
"acc_norm_stderr": 0.03154696285656627
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.02871877688934232,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.02871877688934232
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540616,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540616
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.0349260647662379,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.0349260647662379
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446914,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446914
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977725,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977725
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9195402298850575,
"acc_stderr": 0.009726831316141866,
"acc_norm": 0.9195402298850575,
"acc_norm_stderr": 0.009726831316141866
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8410404624277457,
"acc_stderr": 0.019685307033571946,
"acc_norm": 0.8410404624277457,
"acc_norm_stderr": 0.019685307033571946
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6960893854748603,
"acc_stderr": 0.015382845587584517,
"acc_norm": 0.6960893854748603,
"acc_norm_stderr": 0.015382845587584517
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8496732026143791,
"acc_stderr": 0.02046417512433263,
"acc_norm": 0.8496732026143791,
"acc_norm_stderr": 0.02046417512433263
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8456591639871383,
"acc_stderr": 0.02051905034208471,
"acc_norm": 0.8456591639871383,
"acc_norm_stderr": 0.02051905034208471
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8641975308641975,
"acc_stderr": 0.019061588181505405,
"acc_norm": 0.8641975308641975,
"acc_norm_stderr": 0.019061588181505405
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6631205673758865,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.6631205673758865,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6082138200782269,
"acc_stderr": 0.012467564418145118,
"acc_norm": 0.6082138200782269,
"acc_norm_stderr": 0.012467564418145118
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.022368672562886747,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.022368672562886747
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273337,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273337
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8326530612244898,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.8326530612244898,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9054726368159204,
"acc_stderr": 0.020687186951534094,
"acc_norm": 0.9054726368159204,
"acc_norm_stderr": 0.020687186951534094
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276894,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276894
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6597307221542228,
"mc1_stderr": 0.016586304901762553,
"mc2": 0.7778465654225306,
"mc2_stderr": 0.013819882710780051
},
"harness|winogrande|5": {
"acc": 0.8445146014206788,
"acc_stderr": 0.010184308214775778
},
"harness|gsm8k|5": {
"acc": 0.7922668688400303,
"acc_stderr": 0.011174572716705886
}
}
```
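Since the results block above is an ordinary Python dict once loaded, it can be post-processed directly. A minimal illustrative sketch (not part of the card template; the two entries are copied from the results above, whereas a full dict has one entry per task) that computes the macro-average accuracy over the MMLU (`hendrycksTest`) subtasks:

```python
# Illustrative sketch: given a results dict shaped like the one above, compute
# the macro-average accuracy over the MMLU subtasks. The two entries below are
# copied from the results above; a real dict has one entry per task.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.7185185185185186},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.881578947368421},
}

mmlu_accs = [scores["acc"] for task, scores in results.items()
             if task.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average over {len(mmlu_accs)} subtasks: {mmlu_avg:.4f}")
```

The same key-prefix filtering works for `acc_norm` or for any other task family.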
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
distilled-one-sec-cv12-each-chunk-uniq/chunk_192 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1337199052.0
num_examples: 260561
download_size: 1368468682
dataset_size: 1337199052.0
---
# Dataset Card for "chunk_192"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Capsekai/Misc_Datasets | ---
license: creativeml-openrail-m
task_categories:
- text-to-image
language:
- en
tags:
- stable diffusion
- untagged dataset
- photography
- anime
- art
pretty_name: Dataset Dump Untagged
size_categories:
- 1K<n<10K
---
Everything in here should be under the CreativeML OpenRAIL-M license.
We hope that you enjoy the content.
We're not responsible for anything you do with it.
Go on, use it!
martinakaduc/hh-rlhf-llama2-7b-embedding | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: chosen
sequence: float64
- name: rejected
sequence: float64
splits:
- name: train
num_bytes: 10539475200
num_examples: 160800
- name: test
num_bytes: 560532288
num_examples: 8552
download_size: 6413844185
dataset_size: 11100007488
language:
- en
--- |
nikniksen/TMJIT | ---
dataset_info:
features:
- name: example
dtype: string
splits:
- name: train
num_bytes: 945
num_examples: 1
- name: test
num_bytes: 7940
num_examples: 9
download_size: 16158
dataset_size: 8885
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
JuanJoseMV/MIMMICIII-tokenized_notes_train | ---
dataset_info:
features:
- name: HADMID
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 3309959072
num_examples: 1284922
download_size: 114544651
dataset_size: 3309959072
---
# Dataset Card for "MIMMICIII-tokenized_notes_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_psmathur__model_009 | ---
pretty_name: Evaluation run of psmathur/model_009
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [psmathur/model_009](https://huggingface.co/psmathur/model_009) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__model_009_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-09T07:41:27.734814](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_009_public/blob/main/results_2023-11-09T07-41-27.734814.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3341023489932886,\n\
\ \"em_stderr\": 0.004830400685277283,\n \"f1\": 0.440147860738256,\n\
\ \"f1_stderr\": 0.0045184970708564655,\n \"acc\": 0.6087212395126058,\n\
\ \"acc_stderr\": 0.0120913878225072\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3341023489932886,\n \"em_stderr\": 0.004830400685277283,\n\
\ \"f1\": 0.440147860738256,\n \"f1_stderr\": 0.0045184970708564655\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39423805913570886,\n \
\ \"acc_stderr\": 0.01346085235709565\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918747\n\
\ }\n}\n```"
repo_url: https://huggingface.co/psmathur/model_009
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_09T07_41_27.734814
path:
- '**/details_harness|drop|3_2023-11-09T07-41-27.734814.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-09T07-41-27.734814.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_09T07_41_27.734814
path:
- '**/details_harness|gsm8k|5_2023-11-09T07-41-27.734814.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-09T07-41-27.734814.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_09T07_41_27.734814
path:
- '**/details_harness|winogrande|5_2023-11-09T07-41-27.734814.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-09T07-41-27.734814.parquet'
- config_name: results
data_files:
- split: 2023_11_09T07_41_27.734814
path:
- results_2023-11-09T07-41-27.734814.parquet
- split: latest
path:
- results_2023-11-09T07-41-27.734814.parquet
---
# Dataset Card for Evaluation run of psmathur/model_009
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/model_009
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/model_009](https://huggingface.co/psmathur/model_009) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__model_009_public",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-11-09T07:41:27.734814](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_009_public/blob/main/results_2023-11-09T07-41-27.734814.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3341023489932886,
"em_stderr": 0.004830400685277283,
"f1": 0.440147860738256,
"f1_stderr": 0.0045184970708564655,
"acc": 0.6087212395126058,
"acc_stderr": 0.0120913878225072
},
"harness|drop|3": {
"em": 0.3341023489932886,
"em_stderr": 0.004830400685277283,
"f1": 0.440147860738256,
"f1_stderr": 0.0045184970708564655
},
"harness|gsm8k|5": {
"acc": 0.39423805913570886,
"acc_stderr": 0.01346085235709565
},
"harness|winogrande|5": {
"acc": 0.8232044198895028,
"acc_stderr": 0.010721923287918747
}
}
```
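For this run, the aggregate `"all"` block appears to be the unweighted mean of the per-task metrics. A quick illustrative sketch (values copied from the results above) checking that for `acc`:

```python
# Sketch: check that the aggregate "acc" equals the unweighted mean of the
# per-task accuracies. drop reports em/f1 rather than acc, so only gsm8k and
# winogrande contribute an "acc"; values are copied from the results above.
results = {
    "all": {"acc": 0.6087212395126058},
    "harness|gsm8k|5": {"acc": 0.39423805913570886},
    "harness|winogrande|5": {"acc": 0.8232044198895028},
}
task_accs = [v["acc"] for k, v in results.items() if k != "all"]
mean_acc = sum(task_accs) / len(task_accs)
assert abs(mean_acc - results["all"]["acc"]) < 1e-9
print(f"aggregate acc {results['all']['acc']:.6f} matches mean of per-task accs")
```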
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
lakong/yt-thumbnails-train | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 259863631.184
num_examples: 2067
download_size: 258196017
dataset_size: 259863631.184
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mjalg/llama-fake-news | ---
license: apache-2.0
---
|
Haneen84/Arabic_satire | ---
license: other
---
|
one-sec-cv12/chunk_141 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 19911614832.375
num_examples: 207309
download_size: 18300991649
dataset_size: 19911614832.375
---
# Dataset Card for "chunk_141"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |