| datasetId (string, length 2–117) | card (string, length 19–1.01M) |
|---|---|
bvallegc/videos | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: video_data
dtype: binary
- name: duration_seconds
dtype: float64
- name: video_path
dtype: string
splits:
- name: train
num_bytes: 3786824395
num_examples: 4688
download_size: 3778922511
dataset_size: 3786824395
---
# Dataset Card for "videos"
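A minimal loading sketch for this layout (hedged: `video_data` is treated as raw encoded bytes, and the `.mp4` container format is an assumption, not stated by the card):
```python
from datasets import load_dataset

# Stream one example rather than downloading the full ~3.8 GB split.
ds = load_dataset("bvallegc/videos", split="train", streaming=True)
example = next(iter(ds))

print(example["video_path"], example["duration_seconds"])
# `video_data` is a binary column; the .mp4 extension here is an assumption.
with open("sample_video.mp4", "wb") as f:
    f.write(example["video_data"])
```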
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hazuki_shizuku_newgame | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Hazuki Shizuku
This is the dataset of Hazuki Shizuku, containing 139 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 139 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 313 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 399 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 139 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 139 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 139 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 313 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 313 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 281 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 399 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 399 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
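A hedged sketch of fetching one of the packaged archives with `huggingface_hub`, assuming the zip files sit at the repository root as the relative links in the table suggest:
```python
from huggingface_hub import hf_hub_download

# Fetch the 384x512 aligned package; swap the filename for any row of the table.
zip_path = hf_hub_download(
    repo_id="CyberHarem/hazuki_shizuku_newgame",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
print(zip_path)
```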
|
mlabonne/know_medical_dialogue_v2 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3039804.3126684637
num_examples: 6290
download_size: 1631953
dataset_size: 3039804.3126684637
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
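The schema above (instruction / input / output) maps directly onto an instruction-tuning prompt; a short sketch assuming an Alpaca-style template (the template itself is not specified by the card):
```python
from datasets import load_dataset

ds = load_dataset("mlabonne/know_medical_dialogue_v2", split="train")

def to_prompt(row):
    # Alpaca-style formatting; the exact template is an assumption.
    if row["input"]:
        return (f"### Instruction:\n{row['instruction']}\n\n"
                f"### Input:\n{row['input']}\n\n"
                f"### Response:\n{row['output']}")
    return (f"### Instruction:\n{row['instruction']}\n\n"
            f"### Response:\n{row['output']}")

print(to_prompt(ds[0]))
```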
|
Saraka256/288-demo | ---
license: pddl
---
|
AnaChikashua/handwriting | ---
task_categories:
- image-classification
language:
- ka
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AurumnPegasus/AurumnPegasus | ---
dataset_info:
features:
- name: context
sequence: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 132102296
num_examples: 2649
download_size: 26192269
dataset_size: 132102296
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "AurumnPegasus"
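Since the rows already carry `input_ids` and `attention_mask`, a minimal sketch is to expose them as tensors and feed a model directly (PyTorch is an assumption; the card does not state a target framework):
```python
from datasets import load_dataset

ds = load_dataset("AurumnPegasus/AurumnPegasus", split="train")

# The examples are pre-tokenized; expose them as torch tensors (requires torch).
ds.set_format(type="torch", columns=["input_ids", "attention_mask"])
row = ds[0]
print(row["input_ids"].shape, row["attention_mask"].shape)
```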
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NghiemAbe/sts14 | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
splits:
- name: test
num_bytes: 656488
num_examples: 3750
download_size: 323819
dataset_size: 656488
task_categories:
- sentence-similarity
language:
- vi
---
# Dataset Card for "sts14"
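A hedged evaluation sketch for the sentence-similarity task: encode both sentences and correlate cosine similarity with the gold `score`. The encoder below is only an example choice, not a model associated with this dataset:
```python
from datasets import load_dataset
from scipy.stats import spearmanr
from sentence_transformers import SentenceTransformer, util

sts = load_dataset("NghiemAbe/sts14", split="test")

# Any Vietnamese-capable encoder would do; this particular model is an assumption.
model = SentenceTransformer("sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2")
emb1 = model.encode(sts["sentence1"], convert_to_tensor=True)
emb2 = model.encode(sts["sentence2"], convert_to_tensor=True)

cosine = util.cos_sim(emb1, emb2).diagonal().cpu().numpy()
print("Spearman:", spearmanr(cosine, sts["score"]).correlation)
```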
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marianna13/frontend-instruction-tuning | ---
dataset_info:
features:
- name: __key__
dtype: string
- name: __url__
dtype: string
- name: json
struct:
- name: css
dtype: string
- name: html
dtype: string
- name: png
dtype: image
splits:
- name: train
num_bytes: 65192216.0
num_examples: 996
download_size: 60109313
dataset_size: 65192216.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
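A short access sketch for this schema: the `png` column decodes to a PIL image and the `json` struct carries the page's `html` and `css` sources (the output file name below is illustrative):
```python
from datasets import load_dataset

ds = load_dataset("marianna13/frontend-instruction-tuning", split="train")
row = ds[0]

row["png"].save("page_screenshot.png")   # rendered page screenshot
print(row["json"]["html"][:200])         # HTML source
print(row["json"]["css"][:200])          # CSS source
```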
|
Elfsong/Bias_in_Bios | ---
dataset_info:
features:
- name: hard_text
dtype: string
- name: profession
dtype: string
- name: gender
dtype: string
splits:
- name: train
num_bytes: 108970597
num_examples: 257478
- name: test
num_bytes: 41882750
num_examples: 99069
- name: dev
num_bytes: 16732695
num_examples: 39642
download_size: 99844255
dataset_size: 167586042
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: dev
path: data/dev-*
---
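A quick loading sketch across the three splits, plus a peek at the joint profession/gender distribution (purely illustrative):
```python
from collections import Counter
from datasets import load_dataset

bios = load_dataset("Elfsong/Bias_in_Bios")
print({name: split.num_rows for name, split in bios.items()})

# Joint distribution of the two label columns in the training split.
train = bios["train"]
print(Counter(zip(train["profession"], train["gender"])).most_common(5))
```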
|
zhangshuoming/c_x86_avx2_extension_filtered_test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 299320.0
num_examples: 1101
download_size: 48467
dataset_size: 299320.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "c_x86_avx2_extension_filtered_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_willnguyen__lacda-2-7B-chat-v0.1 | ---
pretty_name: Evaluation run of willnguyen/lacda-2-7B-chat-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [willnguyen/lacda-2-7B-chat-v0.1](https://huggingface.co/willnguyen/lacda-2-7B-chat-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_willnguyen__lacda-2-7B-chat-v0.1_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-09T13:53:53.211938](https://huggingface.co/datasets/open-llm-leaderboard/details_willnguyen__lacda-2-7B-chat-v0.1_public/blob/main/results_2023-11-09T13-53-53.211938.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46065725811605934,\n\
\ \"acc_stderr\": 0.034477280778802896,\n \"acc_norm\": 0.4668080345369505,\n\
\ \"acc_norm_stderr\": 0.035310968004727446,\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.4456721895962505,\n\
\ \"mc2_stderr\": 0.014265726453599933,\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.0003778609196460794,\n \"f1\": 0.05649014261744978,\n\
\ \"f1_stderr\": 0.0013342363586640303\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4803754266211604,\n \"acc_stderr\": 0.014600132075947087,\n\
\ \"acc_norm\": 0.5307167235494881,\n \"acc_norm_stderr\": 0.014583792546304038\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5796654052977495,\n\
\ \"acc_stderr\": 0.0049260381977145225,\n \"acc_norm\": 0.7757418840868353,\n\
\ \"acc_norm_stderr\": 0.0041624039148053385\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.45660377358490567,\n \"acc_stderr\": 0.030656748696739438,\n\
\ \"acc_norm\": 0.45660377358490567,\n \"acc_norm_stderr\": 0.030656748696739438\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835362,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835362\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2619047619047619,\n \"acc_stderr\": 0.02264421261552521,\n \"\
acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.02264421261552521\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4645161290322581,\n \"acc_stderr\": 0.028372287797962956,\n \"\
acc_norm\": 0.4645161290322581,\n \"acc_norm_stderr\": 0.028372287797962956\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n \"\
acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512566,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512566\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.035623524993954825,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.035623524993954825\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.6062176165803109,\n \"acc_stderr\": 0.035260770955482405,\n\
\ \"acc_norm\": 0.6062176165803109,\n \"acc_norm_stderr\": 0.035260770955482405\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4461538461538462,\n \"acc_stderr\": 0.02520357177302833,\n \
\ \"acc_norm\": 0.4461538461538462,\n \"acc_norm_stderr\": 0.02520357177302833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.46218487394957986,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.46218487394957986,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6165137614678899,\n \"acc_stderr\": 0.020847156641915977,\n \"\
acc_norm\": 0.6165137614678899,\n \"acc_norm_stderr\": 0.020847156641915977\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.23148148148148148,\n \"acc_stderr\": 0.028765111718046955,\n \"\
acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.028765111718046955\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.03509312031717982,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.03509312031717982\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.5780590717299579,\n \"acc_stderr\": 0.032148146302403695,\n\
\ \"acc_norm\": 0.5780590717299579,\n \"acc_norm_stderr\": 0.032148146302403695\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n\
\ \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n\
\ \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.04338920305792401,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.04338920305792401\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968431,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968431\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49079754601226994,\n \"acc_stderr\": 0.039277056007874414,\n\
\ \"acc_norm\": 0.49079754601226994,\n \"acc_norm_stderr\": 0.039277056007874414\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.048467482539772386,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.048467482539772386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6709401709401709,\n\
\ \"acc_stderr\": 0.030782321577688173,\n \"acc_norm\": 0.6709401709401709,\n\
\ \"acc_norm_stderr\": 0.030782321577688173\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6143039591315453,\n\
\ \"acc_stderr\": 0.017406476619212907,\n \"acc_norm\": 0.6143039591315453,\n\
\ \"acc_norm_stderr\": 0.017406476619212907\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4884393063583815,\n \"acc_stderr\": 0.026911898686377913,\n\
\ \"acc_norm\": 0.4884393063583815,\n \"acc_norm_stderr\": 0.026911898686377913\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4673202614379085,\n \"acc_stderr\": 0.028568699752225875,\n\
\ \"acc_norm\": 0.4673202614379085,\n \"acc_norm_stderr\": 0.028568699752225875\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4783950617283951,\n \"acc_stderr\": 0.027794760105008746,\n\
\ \"acc_norm\": 0.4783950617283951,\n \"acc_norm_stderr\": 0.027794760105008746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3475177304964539,\n \"acc_stderr\": 0.028406627809590954,\n \
\ \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.028406627809590954\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35723598435462844,\n\
\ \"acc_stderr\": 0.012238615750316503,\n \"acc_norm\": 0.35723598435462844,\n\
\ \"acc_norm_stderr\": 0.012238615750316503\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.43137254901960786,\n \"acc_stderr\": 0.020036393768352638,\n \
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.020036393768352638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46122448979591835,\n \"acc_stderr\": 0.03191282052669277,\n\
\ \"acc_norm\": 0.46122448979591835,\n \"acc_norm_stderr\": 0.03191282052669277\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.672514619883041,\n \"acc_stderr\": 0.03599335771456027,\n\
\ \"acc_norm\": 0.672514619883041,\n \"acc_norm_stderr\": 0.03599335771456027\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.016058999026100612,\n \"mc2\": 0.4456721895962505,\n\
\ \"mc2_stderr\": 0.014265726453599933\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972397\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.001363255033557047,\n \
\ \"em_stderr\": 0.0003778609196460794,\n \"f1\": 0.05649014261744978,\n\
\ \"f1_stderr\": 0.0013342363586640303\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.06292645943896892,\n \"acc_stderr\": 0.0066887625815327395\n\
\ }\n}\n```"
repo_url: https://huggingface.co/willnguyen/lacda-2-7B-chat-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|arc:challenge|25_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|drop|3_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|gsm8k|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hellaswag|10_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T13-53-53.211938.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-09T13-53-53.211938.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- '**/details_harness|winogrande|5_2023-11-09T13-53-53.211938.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-09T13-53-53.211938.parquet'
- config_name: results
data_files:
- split: 2023_11_09T13_53_53.211938
path:
- results_2023-11-09T13-53-53.211938.parquet
- split: latest
path:
- results_2023-11-09T13-53-53.211938.parquet
---
# Dataset Card for Evaluation run of willnguyen/lacda-2-7B-chat-v0.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/willnguyen/lacda-2-7B-chat-v0.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [willnguyen/lacda-2-7B-chat-v0.1](https://huggingface.co/willnguyen/lacda-2-7B-chat-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_willnguyen__lacda-2-7B-chat-v0.1_public",
"harness_winogrande_5",
split="train")
```
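The aggregated metrics can be pulled from the "results" configuration in the same way (the "latest" split always points at the most recent run):
```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_willnguyen__lacda-2-7B-chat-v0.1_public",
    "results",
    split="latest",
)
print(results[0])
```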
## Latest results
These are the [latest results from run 2023-11-09T13:53:53.211938](https://huggingface.co/datasets/open-llm-leaderboard/details_willnguyen__lacda-2-7B-chat-v0.1_public/blob/main/results_2023-11-09T13-53-53.211938.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.46065725811605934,
"acc_stderr": 0.034477280778802896,
"acc_norm": 0.4668080345369505,
"acc_norm_stderr": 0.035310968004727446,
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100612,
"mc2": 0.4456721895962505,
"mc2_stderr": 0.014265726453599933,
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460794,
"f1": 0.05649014261744978,
"f1_stderr": 0.0013342363586640303
},
"harness|arc:challenge|25": {
"acc": 0.4803754266211604,
"acc_stderr": 0.014600132075947087,
"acc_norm": 0.5307167235494881,
"acc_norm_stderr": 0.014583792546304038
},
"harness|hellaswag|10": {
"acc": 0.5796654052977495,
"acc_stderr": 0.0049260381977145225,
"acc_norm": 0.7757418840868353,
"acc_norm_stderr": 0.0041624039148053385
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.45660377358490567,
"acc_stderr": 0.030656748696739438,
"acc_norm": 0.45660377358490567,
"acc_norm_stderr": 0.030656748696739438
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835362,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835362
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.02264421261552521,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.02264421261552521
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4645161290322581,
"acc_stderr": 0.028372287797962956,
"acc_norm": 0.4645161290322581,
"acc_norm_stderr": 0.028372287797962956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512566,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512566
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5,
"acc_stderr": 0.035623524993954825,
"acc_norm": 0.5,
"acc_norm_stderr": 0.035623524993954825
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6062176165803109,
"acc_stderr": 0.035260770955482405,
"acc_norm": 0.6062176165803109,
"acc_norm_stderr": 0.035260770955482405
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4461538461538462,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.4461538461538462,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46218487394957986,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.46218487394957986,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6165137614678899,
"acc_stderr": 0.020847156641915977,
"acc_norm": 0.6165137614678899,
"acc_norm_stderr": 0.020847156641915977
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.028765111718046955,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.028765111718046955
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5,
"acc_stderr": 0.03509312031717982,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03509312031717982
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5780590717299579,
"acc_stderr": 0.032148146302403695,
"acc_norm": 0.5780590717299579,
"acc_norm_stderr": 0.032148146302403695
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5560538116591929,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.5560538116591929,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.04338920305792401,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.04338920305792401
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49079754601226994,
"acc_stderr": 0.039277056007874414,
"acc_norm": 0.49079754601226994,
"acc_norm_stderr": 0.039277056007874414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.048467482539772386,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.048467482539772386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6709401709401709,
"acc_stderr": 0.030782321577688173,
"acc_norm": 0.6709401709401709,
"acc_norm_stderr": 0.030782321577688173
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6143039591315453,
"acc_stderr": 0.017406476619212907,
"acc_norm": 0.6143039591315453,
"acc_norm_stderr": 0.017406476619212907
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4884393063583815,
"acc_stderr": 0.026911898686377913,
"acc_norm": 0.4884393063583815,
"acc_norm_stderr": 0.026911898686377913
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4673202614379085,
"acc_stderr": 0.028568699752225875,
"acc_norm": 0.4673202614379085,
"acc_norm_stderr": 0.028568699752225875
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.02795048149440127,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.02795048149440127
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4783950617283951,
"acc_stderr": 0.027794760105008746,
"acc_norm": 0.4783950617283951,
"acc_norm_stderr": 0.027794760105008746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.028406627809590954,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.028406627809590954
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35723598435462844,
"acc_stderr": 0.012238615750316503,
"acc_norm": 0.35723598435462844,
"acc_norm_stderr": 0.012238615750316503
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.49264705882352944,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.49264705882352944,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.020036393768352638,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.020036393768352638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46122448979591835,
"acc_stderr": 0.03191282052669277,
"acc_norm": 0.46122448979591835,
"acc_norm_stderr": 0.03191282052669277
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.672514619883041,
"acc_stderr": 0.03599335771456027,
"acc_norm": 0.672514619883041,
"acc_norm_stderr": 0.03599335771456027
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3011015911872705,
"mc1_stderr": 0.016058999026100612,
"mc2": 0.4456721895962505,
"mc2_stderr": 0.014265726453599933
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.012298278833972397
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460794,
"f1": 0.05649014261744978,
"f1_stderr": 0.0013342363586640303
},
"harness|gsm8k|5": {
"acc": 0.06292645943896892,
"acc_stderr": 0.0066887625815327395
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
azz1990/test2 | ---
license: apache-2.0
---
|
Supabase/dbpedia-openai-3-large-1M | ---
license: mit
dataset_info:
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 17782586772
num_examples: 1000000
download_size: 17782586772
dataset_size: 1000000
language:
- en
pretty_name: OpenAI text-embedding-3-large with 1M DBPedia Entities
size_categories:
- 1M<n<10M
---
# 1 Million OpenAI Embeddings (3072 dimensions)
- Created: February 2024
- Text used for embedding: title (string) + text (string)
- Embedding model: text-embedding-3-large
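A minimal sketch of using the stored vectors directly, streaming a small sample and ranking entities by cosine similarity against the first one (field names follow the schema above; the sample size is arbitrary):
```python
import itertools

import numpy as np
from datasets import load_dataset

# Stream a small sample so the full ~17 GB dataset is not downloaded.
stream = load_dataset("Supabase/dbpedia-openai-3-large-1M", split="train", streaming=True)
sample = list(itertools.islice(stream, 100))

# Use the first entity as the query and rank the rest by cosine similarity.
emb = np.array([row["embedding"] for row in sample], dtype=np.float32)
emb /= np.linalg.norm(emb, axis=1, keepdims=True)
scores = emb[1:] @ emb[0]
best = int(np.argmax(scores))
print(sample[0]["title"], "->", sample[best + 1]["title"], float(scores[best]))
```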
## Credits:
This dataset was generated from the first 1M entries of https://huggingface.co/datasets/BeIR/dbpedia-entity |
loubnabnl/old_py | ---
dataset_info:
features:
- name: __id__
dtype: int64
- name: blob_id
dtype: string
- name: directory_id
dtype: string
- name: path
dtype: string
- name: content_id
dtype: string
- name: detected_licenses
sequence: string
- name: license_type
dtype: string
- name: repo_name
dtype: string
- name: repo_url
dtype: string
- name: snapshot_id
dtype: string
- name: revision_id
dtype: string
- name: branch_name
dtype: string
- name: visit_date
dtype: timestamp[ns]
- name: revision_date
dtype: timestamp[ns]
- name: committer_date
dtype: timestamp[ns]
- name: github_id
dtype: int64
- name: star_events_count
dtype: int64
- name: fork_events_count
dtype: int64
- name: gha_license_id
dtype: string
- name: gha_fork
dtype: bool
- name: gha_event_created_at
dtype: timestamp[ns]
- name: gha_created_at
dtype: timestamp[ns]
- name: gha_updated_at
dtype: timestamp[ns]
- name: gha_pushed_at
dtype: timestamp[ns]
- name: gha_size
dtype: int64
- name: gha_stargazers_count
dtype: int32
- name: gha_forks_count
dtype: int32
- name: gha_open_issues_count
dtype: int32
- name: gha_language
dtype: string
- name: gha_archived
dtype: bool
- name: gha_disabled
dtype: bool
- name: content
dtype: string
- name: src_encoding
dtype: string
- name: language
dtype: string
- name: is_vendor
dtype: bool
- name: is_generated
dtype: bool
- name: year
dtype: int64
splits:
- name: train
num_bytes: 4842783.826144089
num_examples: 1000
download_size: 2031848
dataset_size: 4842783.826144089
---
# Dataset Card for "old_py"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
librarian-bot/stats | ---
dataset_info:
features:
- name: createdAt
dtype: timestamp[us]
- name: pr_number
dtype: int64
- name: status
dtype: large_string
- name: repo_id
dtype: large_string
- name: type
dtype: large_string
- name: isPullRequest
dtype: bool
splits:
- name: train
num_bytes: 235762
num_examples: 2747
download_size: 95773
dataset_size: 235762
---
# Dataset Card for "stats"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ryan2009/Ig | ---
license: openrail
---
|
fromalanjones/fanfare | ---
license: openrail
---
|
TMZN/lunyu | ---
license: gpl-3.0
task_categories:
- question-answering
language:
- zh
pretty_name: lunyu
size_categories:
- 1K<n<10K
---
One of the datasets built to serve https://huggingface.co/TMZN/ChatGLM-wyw.
# ChatGLM-wyw
A ChatGLM that has read Classical Chinese.
# Origin
On May 16, 2023, after talking about it for a long time, work on having the AI read Classical Chinese officially began.<br>
# Acknowledgements
All-in-one bundle (including the ChatGLM model): link: https://pan.baidu.com/s/13GePNuh8ZP_DkMVRf5sHqw?pwd=2d2z
All-in-one bundle (without the model): link: https://pan.baidu.com/s/1lMfG34jerHO7aFjfdKTGUw?pwd=6y7j
Dataset creation tool (credit to its author): https://github.com/huang1332/finetune_dataset_maker
Model fine-tuning tool (credit to its author): https://github.com/mymusise/ChatGLM-Tuning
Official ChatGLM repository: https://github.com/THUDM/ChatGLM-6B
|
joey234/mmlu-abstract_algebra-neg-answer | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_answer
dtype: string
splits:
- name: test
num_bytes: 20641
num_examples: 100
download_size: 10947
dataset_size: 20641
---
# Dataset Card for "mmlu-abstract_algebra-neg-answer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Pablao0948/Austin_Mahone2 | ---
license: openrail
---
|
jason9693/APEACH | ---
annotations_creators:
- crowdsourced
- crowd-generated
language_creators:
- found
language:
- ko
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: apeach
pretty_name: 'APEACH'
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- binary-classification
---
# Dataset for project: kor_hate_eval(APEACH)

## Sample Code
<a href="https://colab.research.google.com/drive/1djd0fuoMYIaf7VCHaLQIziJi4_yBJruP#scrollTo=VPR24ysr5Q7k"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="base"/></a>
## Dataset Description
Korean Hate Speech Evaluation Datasets: trained with [BEEP!](https://huggingface.co/datasets/kor_hate) and evaluated with [APEACH](https://github.com/jason9693/APEACH)
- **Repository: [Korean HateSpeech Evaluation Dataset](https://github.com/jason9693/APEACH)**
- **Paper: [APEACH: Attacking Pejorative Expressions with Analysis on Crowd-Generated Hate Speech Evaluation Datasets](https://arxiv.org/abs/2202.12459)**
- **Point of Contact: [Kichang Yang](ykcha9@gmail.com)**
### Languages
ko-KR
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
{'text': ['(현재 호텔주인 심정) 아18 난 마른하늘에 날벼락맞고 호텔망하게생겼는데 누군 계속 추모받네....',
'....한국적인 미인의 대표적인 분...너무나 곱고아름다운모습...그모습뒤의 슬픔을 미처 알지못했네요ㅠ'],
'class': ['Spoiled', 'Default']}
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"class": "ClassLabel(num_classes=2, names=['Default', 'Spoiled'], id=None)"
}
```
### Dataset Splits
This dataset is split into train and validation splits. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train (binarized BEEP!) | 7896 |
| valid (APEACH) | 3770 |
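A minimal sketch of loading the dataset with 🤗 Datasets and checking the class balance (the repository id comes from this card; split names are printed rather than hard-coded, since only their descriptions appear in the table above):
```python
from collections import Counter

from datasets import load_dataset

# Load every available split and report its size and class distribution.
apeach = load_dataset("jason9693/APEACH")
for split_name, split in apeach.items():
    counts = Counter(split["class"])
    print(split_name, len(split), dict(counts))
```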
## Citation
```
@article{yang2022apeach,
title={APEACH: Attacking Pejorative Expressions with Analysis on Crowd-Generated Hate Speech Evaluation Datasets},
author={Yang, Kichang and Jang, Wonjun and Cho, Won Ik},
journal={arXiv preprint arXiv:2202.12459},
year={2022}
}
```
|
hearmeneigh/e621-rising-v1-raw | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1192534908282.634
num_examples: 2905671
download_size: 210413447679
dataset_size: 1192534908282.634
pretty_name: 'E621 Rising: Raw Image Dataset v1'
size_categories:
- 1M<n<10M
viewer: false
tags:
- not-for-all-audiences
---
> # Deprecation Notice!
> [This dataset has been superseded by v2](https://huggingface.co/datasets/hearmeneigh/e621-rising-v2-raw). Use v2 instead of this dataset.
**Warning: THIS dataset is NOT suitable for use by minors. The dataset contains X-rated/NSFW content.**
# E621 Rising: Raw Image Dataset v1
**2,905,671** images (~1.1TB) downloaded from `e621.net` with [tags](https://huggingface.co/datasets/hearmeneigh/e621-rising-v1-raw/raw/main/meta/tag-counts.json).
This is a raw, uncurated, and largely unprocessed dataset. You likely want to use the curated version, [available here](https://huggingface.co/datasets/hearmeneigh/e621-rising-v1-curated). This dataset contains all kinds of NSFW material. You have been warned.
## Image Processing
* Only `jpg` and `png` images were considered
* Image width and height have been clamped to `(0, 4096]px`; larger images have been resized to meet the limit
* Alpha channels have been removed
* All images have been converted to `jpg` format
* All images have been converted to TrueColor `RGB`
* All images have been verified to load with `Pillow`
* Metadata from E621 is [available here](https://huggingface.co/datasets/hearmeneigh/e621-rising-v1-raw/tree/main/meta).
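A minimal sketch of the preprocessing steps listed above, using Pillow (this illustrates the listed transformations; it is not the exact script used to build the dataset):
```python
from pathlib import Path

from PIL import Image

MAX_SIDE = 4096  # width/height clamp described above


def preprocess(src: Path, dst_dir: Path) -> Path:
    """Clamp to 4096 px per side, drop alpha, and re-encode as TrueColor JPEG."""
    with Image.open(src) as img:
        img.load()  # confirms the file actually decodes with Pillow
        if max(img.size) > MAX_SIDE:
            img.thumbnail((MAX_SIDE, MAX_SIDE))  # downscales, keeping aspect ratio
        img = img.convert("RGB")  # removes alpha channel, forces TrueColor RGB
        out = dst_dir / (src.stem + ".jpg")
        img.save(out, format="JPEG", quality=95)
    return out
```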
## Tags
For a comprehensive list of tags and counts, [see here](https://huggingface.co/datasets/hearmeneigh/e621-rising-v1-raw/raw/main/meta/tag-counts.json).
### Changes From E621
* Symbols have been prefixed with `symbol:`, e.g. `symbol:<3`
* Aspect ratio has been prefixed with `aspect_ratio:`, e.g. `aspect_ratio:16_9`
* All categories except `general` have been prefixed with the category name, e.g. `artist:somename`. The categories are:
* `artist`
* `copyright`
* `character`
* `species`
* `invalid`
* `meta`
* `lore`
### Additional Tags
* Image rating
* `rating:explicit`
* `rating:questionable`
* `rating:safe`
* Image score
* `score:above_250`
* `score:above_500`
* `score:above_1000`
* `score:above_1500`
* `score:above_2000`
* `score:below_250`
* `score:below_100`
* `score:below_50`
* `score:below_25`
* `score:below_0`
* Image favorites
* `favorites:above_4000`
* `favorites:above_3000`
* `favorites:above_2000`
* `favorites:above_1000`
* `favorites:below_1000`
* `favorites:below_500`
* `favorites:below_250`
* `favorites:below_100`
* `favorites:below_50`
* `favorites:below_25` |
ShrinivasSK/te_en_1 | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: tgt
dtype: string
- name: src
dtype: string
splits:
- name: train
num_bytes: 4096206.9
num_examples: 18000
- name: test
num_bytes: 455134.1
num_examples: 2000
download_size: 2442401
dataset_size: 4551341.0
---
# Dataset Card for "te_en_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bertbsb/herbetbetovozmgi | ---
license: openrail
---
|
autoevaluate/autoeval-eval-samsum-samsum-417ba9-2386774739 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: ahmeddbahaa/xlmroberta-finetune-en-cnn
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: ahmeddbahaa/xlmroberta-finetune-en-cnn
* Dataset: samsum
* Config: samsum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
yangwang825/esc50 | ---
task_categories:
- audio-classification
tags:
- audio
size_categories:
- 1K<n<10K
---
# ESC50
## Dataset Summary
The ESC-50 dataset is a labeled collection of 2,000 environmental audio recordings suitable for benchmarking methods of environmental sound classification. It comprises 2,000 five-second clips covering 50 different classes across natural, human, and domestic sounds, drawn from Freesound.org.
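A minimal sketch of loading this repository with 🤗 Datasets; the repository id comes from this card, and the single "train" split with audio/fold/label fields is an assumption based on the example in the next section:
```python
from datasets import load_dataset

# Assumes a single "train" split exposing the audio/fold/label fields shown below.
esc50 = load_dataset("yangwang825/esc50", split="train")
example = esc50[0]
print(example["fold"], example["label"], example["audio"]["sampling_rate"])
```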
## Data Instances
An example of 'train' looks as follows.
```
{
"audio": {
"path": "ESC-50-master/audio/4-143118-B-7.wav",
"array", array([0.05203247, 0.05285645, 0.05441284, ..., 0.0093689 , 0.00753784, 0.00643921],
"sampling_rate", 44100
},
"fold": 4,
"label": 30
}
``` |
danielshemesh/midjourney | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1134997116.24
num_examples: 4866
download_size: 702442852
dataset_size: 1134997116.24
---
# Dataset Card for "midjourney"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ChanceFocus/flare-fsrl | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: label
sequence: string
- name: token
sequence: string
splits:
- name: test
num_bytes: 200466
num_examples: 97
download_size: 69893
dataset_size: 200466
---
# Dataset Card for "flare-fsrl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nvidia/OpenMath-MATH-masked | ---
license: other
license_name: nvidia-license
task_categories:
- question-answering
- text-generation
language:
- en
tags:
- math
- nvidia
pretty_name: OpenMath MATH Masked
size_categories:
- 1K<n<10K
---
# OpenMath MATH Masked
We release a *masked* version of the [MATH](https://github.com/hendrycks/math) solutions.
This data can be used to aid synthetic generation of additional solutions for the MATH dataset
as it is much less likely to lead to inconsistent reasoning compared to using
the original solutions directly.
This dataset was used to construct [OpenMathInstruct-1](https://huggingface.co/datasets/nvidia/OpenMathInstruct-1):
a math instruction tuning dataset with 1.8M problem-solution pairs
generated using permissively licensed [Mixtral-8x7B](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) model.
For details of how the masked solutions were created, see our [paper](https://arxiv.org/abs/2402.10176).
You can re-create this dataset or apply similar techniques to mask solutions for other datasets
by using our [open-sourced code](https://github.com/Kipok/NeMo-Skills).
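A minimal sketch of inspecting the masked solutions with 🤗 Datasets (the repository id comes from this card; split and column names are printed rather than hard-coded, since they are not spelled out here):
```python
from datasets import load_dataset

# Assumes you have accepted the dataset's license terms on the Hub, if required.
masked = load_dataset("nvidia/OpenMath-MATH-masked")
for split_name, split in masked.items():
    print(split_name, len(split), list(split.features))

first_split = next(iter(masked))
print(masked[first_split][0])
```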
## Citation
If you find our work useful, please consider citing us!
```bibtex
@article{toshniwal2024openmath,
title = {OpenMathInstruct-1: A 1.8 Million Math Instruction Tuning Dataset},
author = {Shubham Toshniwal and Ivan Moshkov and Sean Narenthiran and Daria Gitman and Fei Jia and Igor Gitman},
year = {2024},
journal = {arXiv preprint arXiv: Arxiv-2402.10176}
}
```
## License
The use of this dataset is governed by the [NVIDIA License](LICENSE) which permits commercial usage.
|
scholarly360/terrain_generation_from_sketch_for_game_assets | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_rte_zero_plural | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 766192
num_examples: 2100
- name: train
num_bytes: 686388
num_examples: 1797
download_size: 941495
dataset_size: 1452580
---
# Dataset Card for "MULTI_VALUE_rte_zero_plural"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mlsum | ---
annotations_creators:
- found
language_creators:
- found
language:
- de
- es
- fr
- ru
- tr
license:
- other
multilinguality:
- multilingual
size_categories:
- 100K<n<1M
- 10K<n<100K
source_datasets:
- extended|cnn_dailymail
- original
task_categories:
- summarization
- translation
- text-classification
task_ids:
- news-articles-summarization
- multi-class-classification
- multi-label-classification
- topic-classification
paperswithcode_id: mlsum
pretty_name: MLSUM
dataset_info:
- config_name: de
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: topic
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
splits:
- name: train
num_bytes: 846959840
num_examples: 220887
- name: validation
num_bytes: 47119541
num_examples: 11394
- name: test
num_bytes: 46847612
num_examples: 10701
download_size: 1005814154
dataset_size: 940926993
- config_name: es
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: topic
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
splits:
- name: train
num_bytes: 1214558302
num_examples: 266367
- name: validation
num_bytes: 50643400
num_examples: 10358
- name: test
num_bytes: 71263665
num_examples: 13920
download_size: 1456211154
dataset_size: 1336465367
- config_name: fr
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: topic
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
splits:
- name: train
num_bytes: 1471965014
num_examples: 392902
- name: validation
num_bytes: 70413212
num_examples: 16059
- name: test
num_bytes: 69660288
num_examples: 15828
download_size: 1849565564
dataset_size: 1612038514
- config_name: ru
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: topic
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
splits:
- name: train
num_bytes: 257389497
num_examples: 25556
- name: validation
num_bytes: 9128497
num_examples: 750
- name: test
num_bytes: 9656398
num_examples: 757
download_size: 766226107
dataset_size: 276174392
- config_name: tu
features:
- name: text
dtype: string
- name: summary
dtype: string
- name: topic
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
splits:
- name: train
num_bytes: 641622783
num_examples: 249277
- name: validation
num_bytes: 25530661
num_examples: 11565
- name: test
num_bytes: 27830212
num_examples: 12775
download_size: 942308960
dataset_size: 694983656
config_names:
- de
- es
- fr
- ru
- tu
---
# Dataset Card for MLSUM
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** []()
- **Repository:** https://github.com/recitalAI/MLSUM
- **Paper:** https://www.aclweb.org/anthology/2020.emnlp-main.647/
- **Point of Contact:** [email](thomas@recital.ai)
- **Size of downloaded dataset files:** 1.83 GB
- **Size of the generated dataset:** 4.86 GB
- **Total amount of disk used:** 6.69 GB
### Dataset Summary
We present MLSUM, the first large-scale MultiLingual SUMmarization dataset.
Obtained from online newspapers, it contains 1.5M+ article/summary pairs in five different languages -- namely, French, German, Spanish, Russian, Turkish.
Together with English newspapers from the popular CNN/Daily mail dataset, the collected data form a large-scale multilingual dataset which can enable new research directions for the text summarization community.
We report cross-lingual comparative analyses based on state-of-the-art systems.
These highlight existing biases which motivate the use of a multi-lingual dataset.
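A minimal sketch of loading one language configuration with 🤗 Datasets (configuration and field names follow this card; depending on your `datasets` version, script-based datasets may additionally require `trust_remote_code=True`):
```python
from datasets import load_dataset

# German configuration; the other configs are "es", "fr", "ru" and "tu" (Turkish).
mlsum_de = load_dataset("mlsum", "de", split="validation")
print(mlsum_de[0]["title"], "->", mlsum_de[0]["summary"][:80])
```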
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### de
- **Size of downloaded dataset files:** 346.58 MB
- **Size of the generated dataset:** 940.93 MB
- **Total amount of disk used:** 1.29 GB
An example of 'validation' looks as follows.
```
{
"date": "01/01/2001",
"summary": "A text",
"text": "This is a text",
"title": "A sample",
"topic": "football",
"url": "https://www.google.com"
}
```
#### es
- **Size of downloaded dataset files:** 513.31 MB
- **Size of the generated dataset:** 1.34 GB
- **Total amount of disk used:** 1.85 GB
An example of 'validation' looks as follows.
```
{
"date": "01/01/2001",
"summary": "A text",
"text": "This is a text",
"title": "A sample",
"topic": "football",
"url": "https://www.google.com"
}
```
#### fr
- **Size of downloaded dataset files:** 619.99 MB
- **Size of the generated dataset:** 1.61 GB
- **Total amount of disk used:** 2.23 GB
An example of 'validation' looks as follows.
```
{
"date": "01/01/2001",
"summary": "A text",
"text": "This is a text",
"title": "A sample",
"topic": "football",
"url": "https://www.google.com"
}
```
#### ru
- **Size of downloaded dataset files:** 106.22 MB
- **Size of the generated dataset:** 276.17 MB
- **Total amount of disk used:** 382.39 MB
An example of 'train' looks as follows.
```
{
"date": "01/01/2001",
"summary": "A text",
"text": "This is a text",
"title": "A sample",
"topic": "football",
"url": "https://www.google.com"
}
```
#### tu
- **Size of downloaded dataset files:** 247.50 MB
- **Size of the generated dataset:** 694.99 MB
- **Total amount of disk used:** 942.48 MB
An example of 'train' looks as follows.
```
{
"date": "01/01/2001",
"summary": "A text",
"text": "This is a text",
"title": "A sample",
"topic": "football",
"url": "https://www.google.com"
}
```
### Data Fields
The data fields are the same among all splits.
#### de
- `text`: a `string` feature.
- `summary`: a `string` feature.
- `topic`: a `string` feature.
- `url`: a `string` feature.
- `title`: a `string` feature.
- `date`: a `string` feature.
#### es
- `text`: a `string` feature.
- `summary`: a `string` feature.
- `topic`: a `string` feature.
- `url`: a `string` feature.
- `title`: a `string` feature.
- `date`: a `string` feature.
#### fr
- `text`: a `string` feature.
- `summary`: a `string` feature.
- `topic`: a `string` feature.
- `url`: a `string` feature.
- `title`: a `string` feature.
- `date`: a `string` feature.
#### ru
- `text`: a `string` feature.
- `summary`: a `string` feature.
- `topic`: a `string` feature.
- `url`: a `string` feature.
- `title`: a `string` feature.
- `date`: a `string` feature.
#### tu
- `text`: a `string` feature.
- `summary`: a `string` feature.
- `topic`: a `string` feature.
- `url`: a `string` feature.
- `title`: a `string` feature.
- `date`: a `string` feature.
### Data Splits
|name|train |validation|test |
|----|-----:|---------:|----:|
|de |220887| 11394|10701|
|es |266367| 10358|13920|
|fr |392902| 16059|15828|
|ru | 25556| 750| 757|
|tu |249277| 11565|12775|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
Usage of dataset is restricted to non-commercial research purposes only. Copyright belongs to the original copyright holders. See https://github.com/recitalAI/MLSUM#mlsum
### Citation Information
```
@article{scialom2020mlsum,
title={MLSUM: The Multilingual Summarization Corpus},
author={Scialom, Thomas and Dray, Paul-Alexis and Lamprier, Sylvain and Piwowarski, Benjamin and Staiano, Jacopo},
journal={arXiv preprint arXiv:2004.14900},
year={2020}
}
```
### Contributions
Thanks to [@RachelKer](https://github.com/RachelKer), [@albertvillanova](https://github.com/albertvillanova), [@thomwolf](https://github.com/thomwolf) for adding this dataset. |
elskow/Weather4cast | ---
license: unlicense
---
# This repository contains the dataset for the weather forecasting competition - Datavidia 2022
## File Descriptions
- train.csv - Data used to train the model, containing the features and the target
- train_hourly.csv - Additional data containing the features for every hour
- test.csv - Test data containing the features for predicting the target
- test_hourly.csv - Additional data containing the features for every hour on the dates included in test.csv
- sample_submission.csv - File containing a sample submission for this competition
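A minimal pandas sketch for combining the daily and hourly files (file names follow the list above; the paths assume the CSV files sit in the working directory):
```python
import pandas as pd

# Daily features/target and the supplementary hourly features.
train = pd.read_csv("train.csv", parse_dates=["time"])
train_hourly = pd.read_csv("train_hourly.csv", parse_dates=["time"])

# Aggregate the hourly records to daily means and join them onto the daily table.
hourly_daily = (
    train_hourly
    .assign(date=train_hourly["time"].dt.floor("D"))
    .groupby(["city", "date"])
    .mean(numeric_only=True)
    .reset_index()
)
merged = train.merge(
    hourly_daily, left_on=["city", "time"], right_on=["city", "date"], how="left"
)
print(merged.shape)
```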
## Feature Descriptions
### train.csv
- time – Date of the record
- temperature_2m_max (°C) – Maximum air temperature at 2 m above the surface
- temperature_2m_min (°C) – Minimum air temperature at 2 m above the surface
- apparent_temperature_max (°C) – Maximum apparent (felt) temperature
- apparent_temperature_min (°C) – Minimum apparent (felt) temperature
- sunrise (iso8601) – Sunrise time on that day, in ISO 8601 format
- sunset (iso8601) – Sunset time on that day, in ISO 8601 format
- shortwave_radiation_sum (MJ/m²) – Total solar radiation on that day
- rain_sum (mm) – Total rainfall on that day
- snowfall_sum (cm) – Total snowfall on that day
- windspeed_10m_max (km/h) – Maximum wind speed at a height of 10 m
- windgusts_10m_max (km/h) – Maximum wind gust speed at a height of 10 m
- winddirection_10m_dominant (°) – Dominant wind direction on that day
- et0_fao_evapotranspiration (mm) – Total evaporation and transpiration on that day
- elevation – Elevation of the recorded city
- city – Name of the recorded city
### train_hourly.csv
- time – Date and hour of the record
- temperature_2m (°C) – Temperature at a height of 2 m
- relativehumidity_2m (%) – Relative humidity at a height of 2 m
- dewpoint_2m (°C) – Dew point; the temperature at which the air starts to condense
- apparent_temperature (°C) – Apparent (felt) temperature
- pressure_msl (hPa) – Air pressure at mean sea level
- surface_pressure (hPa) – Air pressure at the surface elevation of the area
- snowfall (cm) – Snowfall during that hour
- cloudcover (%) – Percentage of the sky covered by clouds
- cloudcover_low (%) – Cloud cover percentage for clouds up to 2 km altitude
- cloudcover_mid (%) – Cloud cover percentage at 2-6 km altitude
- cloudcover_high (%) – Cloud cover percentage above 6 km altitude
- shortwave_radiation (W/m²) – Average solar radiation energy across the infrared to ultraviolet wavelengths
- direct_radiation (W/m²) – Average direct solar radiation on 1 m² of ground surface
- diffuse_radiation (W/m²) – Average solar radiation scattered by the surface and the atmosphere
- direct_normal_irradiance (W/m²) – Average direct solar radiation on a 1 m² surface perpendicular to the direction of the radiation
- windspeed_10m (km/h) – Wind speed at a height of 10 m
- windspeed_100m (km/h) – Wind speed at a height of 100 m
- winddirection_10m (°) – Wind direction at a height of 10 m
- winddirection_100m (°) – Wind direction at a height of 100 m
- windgusts_10m (km/h) – Wind speed during gusts (strong winds)
- et0_fao_evapotranspiration (mm) – Total evapotranspiration (evaporation and transpiration) during that hour
- vapor_pressure_deficit (kPa) – Difference between the air's water vapor pressure and the vapor pressure when the air is saturated
- soil_temperature_0_to_7cm (°C) – Average soil temperature at a depth of 0-7 cm
- soil_temperature_7_to_28cm (°C) – Average soil temperature at a depth of 7-28 cm
- soil_temperature_28_to_100cm (°C) – Average soil temperature at a depth of 28-100 cm
- soil_temperature_100_to_255cm (°C) – Average soil temperature at a depth of 100-255 cm
- soil_moisture_0_to_7cm (m³/m³) – Average soil moisture at a depth of 0-7 cm
- soil_moisture_7_to_28cm (m³/m³) – Average soil moisture at a depth of 7-28 cm
- soil_moisture_28_to_100cm (m³/m³) – Average soil moisture at a depth of 28-100 cm
- soil_moisture_100_to_255cm (m³/m³) – Average soil moisture at a depth of 100-255 cm
- city – Name of the city |
AIML-TUDA/TEdBench_plusplus | ---
license: apache-2.0
task_categories:
- image-to-image
pretty_name: TEdBench++
size_categories:
- n<1K
---
# TEdBench++
This dataset contains TEdBench++, an image-to-image benchmark for text-based generative models. It contains original images (originals) and edited images (LEdits++) for benchmarking. ``tedbench++.csv`` contains the text-based edit instruction for each original image, together with the parameters to reproduce the edited images with LEdits++.
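A minimal sketch of reading the edit instructions with pandas (the CSV file name comes from this card; its column names are not asserted, so the example only prints them):
```python
import pandas as pd

# Inspect the edit instructions and LEdits++ parameters shipped with the benchmark.
edits = pd.read_csv("tedbench++.csv")
print(edits.columns.tolist())
print(edits.head())
```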
|
FreedomIntelligence/Evol-Instruct-Chinese-GPT4 | ---
language:
- zh
size_categories:
- 100M<n<1B
task_categories:
- text-generation
- conversational
- text2text-generation
---
The dataset was created by (1) translating the English questions of [Evol-instruct-70k](https://huggingface.co/datasets/WizardLM/WizardLM_evol_instruct_70k) into Chinese and (2) asking GPT-4 to generate Chinese responses.
For more details, please refer to:
- **Repository**:
- https://github.com/FreedomIntelligence/AceGPT
- https://github.com/FreedomIntelligence/LLMZoo
- **Paper**:
- [AceGPT, Localizing Large Language Models in Arabic](https://arxiv.org/abs/2309.12053)
- [Phoenix: Democratizing ChatGPT across Languages](https://arxiv.org/abs/2304.10453)
### BibTeX entry and citation info
```bibtex
@article{huang2023acegpt,
title={AceGPT, Localizing Large Language Models in Arabic},
author={Huang, Huang and Yu, Fei and Zhu, Jianqing and Sun, Xuening and Cheng, Hao and Song, Dingjie and Chen, Zhihong and Alharthi, Abdulmohsen and An, Bang and Liu, Ziche and others},
journal={arXiv preprint arXiv:2309.12053},
year={2023}
}
@article{chen2023phoenix,
title={Phoenix: Democratizing chatgpt across languages},
author={Chen, Zhihong and Jiang, Feng and Chen, Junying and Wang, Tiannan and Yu, Fei and Chen, Guiming and Zhang, Hongbo and Liang, Juhao and Zhang, Chen and Zhang, Zhiyi and others},
journal={arXiv preprint arXiv:2304.10453},
year={2023}
}
``` |
tasksource/lonli | ---
license: mit
task_ids:
- natural-language-inference
task_categories:
- text-classification
language:
- en
---
https://github.com/microsoft/LoNLI
```bibtex
@article{Tarunesh2021TrustingRO,
title={Trusting RoBERTa over BERT: Insights from CheckListing the Natural Language Inference Task},
author={Ishan Tarunesh and Somak Aditya and Monojit Choudhury},
journal={ArXiv},
year={2021},
volume={abs/2107.07229}
}
``` |
freshpearYoon/val_free_4 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604873992
num_examples: 10000
download_size: 1441610889
dataset_size: 9604873992
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sageofai/med_datavqa | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 218506996.0
num_examples: 2000
download_size: 497523831
dataset_size: 218506996.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
reralle/saa-march | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': arabic
'1': dutch
'2': french
'3': korean
'4': mandarin
'5': portuguese
'6': russian
'7': spanish
'8': uk
'9': usa
splits:
- name: train
num_bytes: 417147954.0
num_examples: 796
- name: test
num_bytes: 53551048.0
num_examples: 100
download_size: 462662864
dataset_size: 470699002.0
---
# Dataset Card for "saa-march"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
healthcorum/autotrain-data-tu9p-fvi7-zb2n | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: responses
dtype: string
- name: autotrain_text
dtype: string
splits:
- name: train
num_bytes: 36088167
num_examples: 9998
- name: validation
num_bytes: 36088167
num_examples: 9998
download_size: 12071286
dataset_size: 72176334
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "autotrain-data-tu9p-fvi7-zb2n"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Alsebay__NarumashiRTS-V2 | ---
pretty_name: Evaluation run of Alsebay/NarumashiRTS-V2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Alsebay/NarumashiRTS-V2](https://huggingface.co/Alsebay/NarumashiRTS-V2) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Alsebay__NarumashiRTS-V2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T20:44:50.479259](https://huggingface.co/datasets/open-llm-leaderboard/details_Alsebay__NarumashiRTS-V2/blob/main/results_2024-04-15T20-44-50.479259.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6456790360227179,\n\
\ \"acc_stderr\": 0.032200447788181916,\n \"acc_norm\": 0.6463296279288366,\n\
\ \"acc_norm_stderr\": 0.03285452814707287,\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6453581106823452,\n\
\ \"mc2_stderr\": 0.015459952731749608\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6527303754266212,\n \"acc_stderr\": 0.013913034529620446,\n\
\ \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.01357265770308495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6795459071898028,\n\
\ \"acc_stderr\": 0.004656974162147998,\n \"acc_norm\": 0.8614817765385382,\n\
\ \"acc_norm_stderr\": 0.003447370972192066\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n\
\ \"acc_stderr\": 0.04724007352383887,\n \"acc_norm\": 0.3431372549019608,\n\
\ \"acc_norm_stderr\": 0.04724007352383887\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n\
\ \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"\
acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406786,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406786\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.044631127206771704,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.044631127206771704\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642514,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593556,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593556\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8568807339449541,\n \"acc_stderr\": 0.01501446249716859,\n \"\
acc_norm\": 0.8568807339449541,\n \"acc_norm_stderr\": 0.01501446249716859\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926913,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926913\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464074,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464074\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.02353292543104429,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.02353292543104429\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42793296089385474,\n\
\ \"acc_stderr\": 0.016547887997416112,\n \"acc_norm\": 0.42793296089385474,\n\
\ \"acc_norm_stderr\": 0.016547887997416112\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n\
\ \"acc_stderr\": 0.02521804037341063,\n \"acc_norm\": 0.729903536977492,\n\
\ \"acc_norm_stderr\": 0.02521804037341063\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\
\ \"acc_stderr\": 0.012734923579532067,\n \"acc_norm\": 0.46284224250325945,\n\
\ \"acc_norm_stderr\": 0.012734923579532067\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6470588235294118,\n \"acc_stderr\": 0.01933314202079716,\n \
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.01933314202079716\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616915,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616915\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4773561811505508,\n\
\ \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6453581106823452,\n\
\ \"mc2_stderr\": 0.015459952731749608\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626918\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6709628506444276,\n \
\ \"acc_stderr\": 0.01294237560367937\n }\n}\n```"
repo_url: https://huggingface.co/Alsebay/NarumashiRTS-V2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|arc:challenge|25_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|gsm8k|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hellaswag|10_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-44-50.479259.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T20-44-50.479259.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- '**/details_harness|winogrande|5_2024-04-15T20-44-50.479259.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T20-44-50.479259.parquet'
- config_name: results
data_files:
- split: 2024_04_15T20_44_50.479259
path:
- results_2024-04-15T20-44-50.479259.parquet
- split: latest
path:
- results_2024-04-15T20-44-50.479259.parquet
---
# Dataset Card for Evaluation run of Alsebay/NarumashiRTS-V2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Alsebay/NarumashiRTS-V2](https://huggingface.co/Alsebay/NarumashiRTS-V2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Alsebay__NarumashiRTS-V2",
"harness_winogrande_5",
split="train")
```
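You can also retrieve the aggregated scores directly. The snippet below is a minimal sketch, assuming the "results" configuration and the "latest" split declared in the YAML header above (the exact columns stored in the parquet file are not documented on this card):
```python
from datasets import load_dataset

# Load the aggregated metrics of the most recent evaluation run
# through the "results" configuration declared above.
results = load_dataset(
    "open-llm-leaderboard/details_Alsebay__NarumashiRTS-V2",
    "results",
    split="latest",
)
print(results[0])  # single row holding the aggregated scores of the latest run
```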
## Latest results
These are the [latest results from run 2024-04-15T20:44:50.479259](https://huggingface.co/datasets/open-llm-leaderboard/details_Alsebay__NarumashiRTS-V2/blob/main/results_2024-04-15T20-44-50.479259.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6456790360227179,
"acc_stderr": 0.032200447788181916,
"acc_norm": 0.6463296279288366,
"acc_norm_stderr": 0.03285452814707287,
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6453581106823452,
"mc2_stderr": 0.015459952731749608
},
"harness|arc:challenge|25": {
"acc": 0.6527303754266212,
"acc_stderr": 0.013913034529620446,
"acc_norm": 0.6851535836177475,
"acc_norm_stderr": 0.01357265770308495
},
"harness|hellaswag|10": {
"acc": 0.6795459071898028,
"acc_stderr": 0.004656974162147998,
"acc_norm": 0.8614817765385382,
"acc_norm_stderr": 0.003447370972192066
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.025446365634406786,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.025446365634406786
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.044631127206771704,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.044631127206771704
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642514,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593556,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593556
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8568807339449541,
"acc_stderr": 0.01501446249716859,
"acc_norm": 0.8568807339449541,
"acc_norm_stderr": 0.01501446249716859
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926913,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926913
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464074,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464074
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.02353292543104429,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.02353292543104429
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42793296089385474,
"acc_stderr": 0.016547887997416112,
"acc_norm": 0.42793296089385474,
"acc_norm_stderr": 0.016547887997416112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341063,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532067,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.01933314202079716,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.01933314202079716
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616915,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616915
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4773561811505508,
"mc1_stderr": 0.01748554225848965,
"mc2": 0.6453581106823452,
"mc2_stderr": 0.015459952731749608
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626918
},
"harness|gsm8k|5": {
"acc": 0.6709628506444276,
"acc_stderr": 0.01294237560367937
}
}
```
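Since every per-task entry above shares the same keys, the results can be post-processed directly. Below is a minimal sketch, assuming the JSON object above has been saved locally as `results.json` (a hypothetical filename), that averages the 5-shot accuracies of the MMLU ("hendrycksTest") sub-tasks:
```python
import json

# Average the "acc" field over all harness|hendrycksTest-* entries of the results shown above.
with open("results.json") as f:
    results = json.load(f)

mmlu_scores = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_scores)} MMLU tasks, mean acc = {sum(mmlu_scores) / len(mmlu_scores):.4f}")
```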
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mmoebis/5gdata_2 | ---
dataset_info:
features:
- name: Sentences
dtype: string
- name: Questions
dtype: string
- name: __index_level_0__
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 273713
num_examples: 663
download_size: 11659
dataset_size: 273713
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
adamzinebi/hiphop | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 117189578
num_examples: 7381
download_size: 14758588
dataset_size: 117189578
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
atsushi015/samples | ---
license: creativeml-openrail-m
---
|
Atipico1/mrqa-adv-test-adv-gpt-passage-entity | ---
dataset_info:
features:
- name: subset
dtype: string
- name: qid
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: masked_query
dtype: string
- name: context
dtype: string
- name: answer_sent
dtype: string
- name: answer_in_context
sequence: string
- name: entity
dtype: string
- name: similar_entity
dtype: string
- name: clear_answer_sent
dtype: string
- name: vague_answer_sent
dtype: string
- name: adversary
dtype: string
- name: replace_count
dtype: int64
- name: adversarial_passage
dtype: string
- name: masked_answer_sent
dtype: string
- name: num_mask_token
dtype: int64
- name: entities
sequence: string
- name: gpt_adv_sent
dtype: string
splits:
- name: train
num_bytes: 2063372
num_examples: 1000
download_size: 1360341
dataset_size: 2063372
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
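A minimal loading sketch, assuming the default configuration and the single `train` split declared above:
```python
from datasets import load_dataset

# Load the dataset and inspect one example's adversarial passage.
ds = load_dataset("Atipico1/mrqa-adv-test-adv-gpt-passage-entity", split="train")
print(ds.column_names)                     # question, answers, adversarial_passage, ...
print(ds[0]["question"])
print(ds[0]["adversarial_passage"][:200])  # first 200 characters of the perturbed passage
```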
|
AdapterOcean/data-standardized_cluster_8 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 45005561
num_examples: 4422
download_size: 12745413
dataset_size: 45005561
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-standardized_cluster_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Steelskull__Umbra-MoE-4x10.7 | ---
pretty_name: Evaluation run of Steelskull/Umbra-MoE-4x10.7
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Steelskull/Umbra-MoE-4x10.7](https://huggingface.co/Steelskull/Umbra-MoE-4x10.7)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Steelskull__Umbra-MoE-4x10.7\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T00:54:53.184339](https://huggingface.co/datasets/open-llm-leaderboard/details_Steelskull__Umbra-MoE-4x10.7/blob/main/results_2024-01-21T00-54-53.184339.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6674032750190299,\n\
\ \"acc_stderr\": 0.0314926889496487,\n \"acc_norm\": 0.6684896093314947,\n\
\ \"acc_norm_stderr\": 0.03213090427046816,\n \"mc1\": 0.5324357405140759,\n\
\ \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6782098863716366,\n\
\ \"mc2_stderr\": 0.015273304296026847\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6715017064846417,\n \"acc_stderr\": 0.013724978465537302,\n\
\ \"acc_norm\": 0.7030716723549488,\n \"acc_norm_stderr\": 0.013352025976725228\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7002589125672177,\n\
\ \"acc_stderr\": 0.0045720816569656455,\n \"acc_norm\": 0.8781119298944433,\n\
\ \"acc_norm_stderr\": 0.0032648787375868854\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800886,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800886\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130726,\n \"\
acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130726\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n\
\ \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.8129032258064516,\n\
\ \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942084,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942084\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603347,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603347\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941183,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941183\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857403,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857403\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.028657491285071987,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.028657491285071987\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590172,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590172\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5787037037037037,\n \"acc_stderr\": 0.033674621388960775,\n \"\
acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.033674621388960775\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801584,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801584\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \
\ \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596915,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596915\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.01358661921990333,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.01358661921990333\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4044692737430168,\n\
\ \"acc_stderr\": 0.01641444091729315,\n \"acc_norm\": 0.4044692737430168,\n\
\ \"acc_norm_stderr\": 0.01641444091729315\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046102,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046102\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.0254942593506949,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.0254942593506949\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49282920469361147,\n\
\ \"acc_stderr\": 0.012768922739553304,\n \"acc_norm\": 0.49282920469361147,\n\
\ \"acc_norm_stderr\": 0.012768922739553304\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n\
\ \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.696078431372549,\n \"acc_stderr\": 0.01860755213127983,\n \
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.01860755213127983\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533207,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533207\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5324357405140759,\n\
\ \"mc1_stderr\": 0.017466632149577613,\n \"mc2\": 0.6782098863716366,\n\
\ \"mc2_stderr\": 0.015273304296026847\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828075\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6474601971190296,\n \
\ \"acc_stderr\": 0.013159909755930333\n }\n}\n```"
repo_url: https://huggingface.co/Steelskull/Umbra-MoE-4x10.7
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|arc:challenge|25_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|gsm8k|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hellaswag|10_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T00-54-53.184339.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T00-54-53.184339.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- '**/details_harness|winogrande|5_2024-01-21T00-54-53.184339.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T00-54-53.184339.parquet'
- config_name: results
data_files:
- split: 2024_01_21T00_54_53.184339
path:
- results_2024-01-21T00-54-53.184339.parquet
- split: latest
path:
- results_2024-01-21T00-54-53.184339.parquet
---
# Dataset Card for Evaluation run of Steelskull/Umbra-MoE-4x10.7
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Steelskull/Umbra-MoE-4x10.7](https://huggingface.co/Steelskull/Umbra-MoE-4x10.7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Steelskull__Umbra-MoE-4x10.7",
"harness_winogrande_5",
split="train")
```
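The aggregated metrics live in the "results" configuration. A minimal sketch for loading them, assuming the "latest" split declared in the configs above:
```python
from datasets import load_dataset

# Aggregated metrics of the run, as used by the leaderboard
# ("results" configuration, "latest" split from this card's configs).
results = load_dataset(
    "open-llm-leaderboard/details_Steelskull__Umbra-MoE-4x10.7",
    "results",
    split="latest",
)
print(results[0])
```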
## Latest results
These are the [latest results from run 2024-01-21T00:54:53.184339](https://huggingface.co/datasets/open-llm-leaderboard/details_Steelskull__Umbra-MoE-4x10.7/blob/main/results_2024-01-21T00-54-53.184339.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6674032750190299,
"acc_stderr": 0.0314926889496487,
"acc_norm": 0.6684896093314947,
"acc_norm_stderr": 0.03213090427046816,
"mc1": 0.5324357405140759,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.6782098863716366,
"mc2_stderr": 0.015273304296026847
},
"harness|arc:challenge|25": {
"acc": 0.6715017064846417,
"acc_stderr": 0.013724978465537302,
"acc_norm": 0.7030716723549488,
"acc_norm_stderr": 0.013352025976725228
},
"harness|hellaswag|10": {
"acc": 0.7002589125672177,
"acc_stderr": 0.0045720816569656455,
"acc_norm": 0.8781119298944433,
"acc_norm_stderr": 0.0032648787375868854
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800886,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800886
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47883597883597884,
"acc_stderr": 0.025728230952130726,
"acc_norm": 0.47883597883597884,
"acc_norm_stderr": 0.025728230952130726
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.023253157951942084,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.023253157951942084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603347,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603347
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941183,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941183
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857403,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857403
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.028657491285071987,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.028657491285071987
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590172,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590172
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.033674621388960775,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.033674621388960775
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801584,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801584
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596915,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596915
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.01358661921990333,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.01358661921990333
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4044692737430168,
"acc_stderr": 0.01641444091729315,
"acc_norm": 0.4044692737430168,
"acc_norm_stderr": 0.01641444091729315
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.024288619466046102,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.024288619466046102
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.0254942593506949,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.0254942593506949
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49282920469361147,
"acc_stderr": 0.012768922739553304,
"acc_norm": 0.49282920469361147,
"acc_norm_stderr": 0.012768922739553304
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7536764705882353,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.7536764705882353,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.01860755213127983,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.01860755213127983
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533207,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533207
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5324357405140759,
"mc1_stderr": 0.017466632149577613,
"mc2": 0.6782098863716366,
"mc2_stderr": 0.015273304296026847
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828075
},
"harness|gsm8k|5": {
"acc": 0.6474601971190296,
"acc_stderr": 0.013159909755930333
}
}
```
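For a quick look at the per-task breakdown, the block above can be sliced once it is parsed as JSON. The sketch below assumes the results were saved locally as `results.json` (a hypothetical filename) and ranks the MMLU (`hendrycksTest`) subsets by normalized accuracy:
```python
import json

# Load the results shown above (assumed to have been saved as results.json).
with open("results.json") as f:
    results = json.load(f)

# Keep only the per-subject MMLU entries and sort them by acc_norm.
mmlu = {
    task: scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest")
}
for task, acc_norm in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc_norm:.3f}")
```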
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
teowu/LLVisionQA-QBench | ---
license: cc-by-nc-sa-4.0
---
Dataset for Paper: **Q-Bench: A Benchmark for General-Purpose Foundation Models on Low-level Vision**.
- *Images*: `images.tar`
- `dev` *labels*: `llvisionqa_dev.json`
- `test` *labels*: `llvisionqa_test.json`
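A minimal sketch for fetching the files listed above from the Hub (assuming they sit at the repository root; the exact label schema is documented in the GitHub repository linked below, not here):
```python
import json
import tarfile

from huggingface_hub import hf_hub_download

REPO_ID = "teowu/LLVisionQA-QBench"

# Filenames as listed in this card (assumed to be at the repository root).
labels_path = hf_hub_download(REPO_ID, "llvisionqa_dev.json", repo_type="dataset")
images_path = hf_hub_download(REPO_ID, "images.tar", repo_type="dataset")

with open(labels_path) as f:
    dev_labels = json.load(f)
print(f"Loaded {len(dev_labels)} dev entries")

# Extract the images into a local folder.
with tarfile.open(images_path) as tar:
    tar.extractall("qbench_images")
```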
See the GitHub repository for usage instructions: https://github.com/vqassessment/q-bench.
If you use this dataset, feel free to cite us:
```bibtex
@article{wu2023qbench,
title={Q-Bench: A Benchmark for General-Purpose Foundation Models on Low-level Vision},
author={Wu, Haoning and Zhang, Zicheng and Zhang, Erli and Chen, Chaofeng and Liao, Liang and Wang, Annan and Li, Chunyi and Sun, Wenxiu and Yan, Qiong and Zhai, Guangtao and Lin, Weisi},
year={2023},
eprint={2309.14181},
}
``` |
whu9/medsum_train_512 | ---
dataset_info:
features:
- name: source
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 161451994.94889495
num_examples: 17259
download_size: 16034976
dataset_size: 161451994.94889495
---
# Dataset Card for "medsum_train_512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
michaelmallari/nfl | ---
license: mit
---
|
ovior/twitter_dataset_1713171127 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2738737
num_examples: 7825
download_size: 1589848
dataset_size: 2738737
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ghiffaryr/qna-japanese | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1687978939.2
num_examples: 7259472
- name: validation
num_bytes: 210997367.4
num_examples: 907434
- name: test
num_bytes: 210997367.4
num_examples: 907434
download_size: 1174609259
dataset_size: 2109973674.0000002
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
yashtiwari/fleurs-hi-en-ST | ---
dataset_info:
features:
- name: id
dtype: int64
- name: hindi
dtype: string
- name: english
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
splits:
- name: train
num_bytes: 1286250983
num_examples: 876
download_size: 824653765
dataset_size: 1286250983
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "fleurs-hi-en-ST"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
This is a dataset for speech-to-text translation from Hindi to English. It was built from the FLEURS and FLORES datasets.
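A minimal sketch of how the single `train` split could be loaded and inspected; the field names follow the feature list in the YAML header above:
```python
from datasets import load_dataset

# Load the only available split of this Hindi-to-English speech translation set.
ds = load_dataset("yashtiwari/fleurs-hi-en-ST", split="train")

example = ds[0]
print(example["hindi"])    # Hindi transcript
print(example["english"])  # English translation

# "audio" is a plain struct (not an Audio feature): a dict holding the raw
# waveform, the original file path, and the sampling rate.
audio = example["audio"]
print(len(audio["array"]), audio["sampling_rate"])
```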
ZAYNBAKA/Rogerio | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_cola_if_would | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 397
num_examples: 5
- name: test
num_bytes: 454
num_examples: 5
- name: train
num_bytes: 6401
num_examples: 77
download_size: 9522
dataset_size: 7252
---
# Dataset Card for "MULTI_VALUE_cola_if_would"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kjappelbaum/chemnlp-ocp | ---
dataset_info:
features:
- name: id
dtype: string
- name: target
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 233206947
num_examples: 100000
- name: valid
num_bytes: 57773992
num_examples: 25000
download_size: 88580458
dataset_size: 290980939
---
# Dataset Card for "chemnlp-ocp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vwxyzjn/test-dataset-bug2 | ---
dataset_info:
features:
- name: data
sequence: int64
splits:
- name: remove_CritiqueRequest_10_18_2023_1697667530
num_bytes: 40
num_examples: 2
download_size: 1065
dataset_size: 40
configs:
- config_name: default
data_files:
- split: remove_CritiqueRequest_10_18_2023_1697667530
path: data/remove_CritiqueRequest_10_18_2023_1697667530-*
---
# Dataset Card for "test-dataset-bug2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Isotonic__TinyMixtral-4x248M-MoE | ---
pretty_name: Evaluation run of Isotonic/TinyMixtral-4x248M-MoE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Isotonic/TinyMixtral-4x248M-MoE](https://huggingface.co/Isotonic/TinyMixtral-4x248M-MoE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Isotonic__TinyMixtral-4x248M-MoE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-08T22:20:25.743847](https://huggingface.co/datasets/open-llm-leaderboard/details_Isotonic__TinyMixtral-4x248M-MoE/blob/main/results_2024-04-08T22-20-25.743847.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25062534015174737,\n\
\ \"acc_stderr\": 0.030703691746189168,\n \"acc_norm\": 0.25194345598726114,\n\
\ \"acc_norm_stderr\": 0.03152544108870022,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871093,\n \"mc2\": 0.483038971320109,\n\
\ \"mc2_stderr\": 0.01652857847450844\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.21075085324232082,\n \"acc_stderr\": 0.01191827175485218,\n\
\ \"acc_norm\": 0.27474402730375425,\n \"acc_norm_stderr\": 0.013044617212771227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2548297151961761,\n\
\ \"acc_stderr\": 0.004348748730529936,\n \"acc_norm\": 0.25433180641306513,\n\
\ \"acc_norm_stderr\": 0.004345949382382375\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.02937917046412482,\n\
\ \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.02937917046412482\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.021411684393694196,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.021411684393694196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604671,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604671\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.18064516129032257,\n\
\ \"acc_stderr\": 0.021886178567172548,\n \"acc_norm\": 0.18064516129032257,\n\
\ \"acc_norm_stderr\": 0.021886178567172548\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.031447125816782405,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.031447125816782405\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25757575757575757,\n \"acc_stderr\": 0.03115626951964684,\n \"\
acc_norm\": 0.25757575757575757,\n \"acc_norm_stderr\": 0.03115626951964684\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.18134715025906736,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.18134715025906736,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.02037766097037138,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.02037766097037138\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473836,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473836\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.20733944954128442,\n \"acc_stderr\": 0.01738141556360866,\n \"\
acc_norm\": 0.20733944954128442,\n \"acc_norm_stderr\": 0.01738141556360866\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16666666666666666,\n \"acc_stderr\": 0.025416428388767485,\n \"\
acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.025416428388767485\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n\
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3542600896860987,\n\
\ \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.3542600896860987,\n\
\ \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052191,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052191\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615767,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615767\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2822477650063857,\n\
\ \"acc_stderr\": 0.016095302969878555,\n \"acc_norm\": 0.2822477650063857,\n\
\ \"acc_norm_stderr\": 0.016095302969878555\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21895424836601307,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.21895424836601307,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21543408360128619,\n\
\ \"acc_stderr\": 0.02335022547547143,\n \"acc_norm\": 0.21543408360128619,\n\
\ \"acc_norm_stderr\": 0.02335022547547143\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.025407197798890155,\n\
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.025407197798890155\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590638,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590638\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24641460234680573,\n\
\ \"acc_stderr\": 0.011005971399927235,\n \"acc_norm\": 0.24641460234680573,\n\
\ \"acc_norm_stderr\": 0.011005971399927235\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4522058823529412,\n \"acc_stderr\": 0.030233758551596452,\n\
\ \"acc_norm\": 0.4522058823529412,\n \"acc_norm_stderr\": 0.030233758551596452\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2581699346405229,\n \"acc_stderr\": 0.017704531653250068,\n \
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.017704531653250068\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\
\ \"acc_stderr\": 0.03384429155233137,\n \"acc_norm\": 0.25301204819277107,\n\
\ \"acc_norm_stderr\": 0.03384429155233137\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871093,\n \"mc2\": 0.483038971320109,\n\
\ \"mc2_stderr\": 0.01652857847450844\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.48697711128650356,\n \"acc_stderr\": 0.014047718393997674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Isotonic/TinyMixtral-4x248M-MoE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|arc:challenge|25_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|gsm8k|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hellaswag|10_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-20-25.743847.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T22-20-25.743847.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- '**/details_harness|winogrande|5_2024-04-08T22-20-25.743847.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-08T22-20-25.743847.parquet'
- config_name: results
data_files:
- split: 2024_04_08T22_20_25.743847
path:
- results_2024-04-08T22-20-25.743847.parquet
- split: latest
path:
- results_2024-04-08T22-20-25.743847.parquet
---
# Dataset Card for Evaluation run of Isotonic/TinyMixtral-4x248M-MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Isotonic/TinyMixtral-4x248M-MoE](https://huggingface.co/Isotonic/TinyMixtral-4x248M-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Isotonic__TinyMixtral-4x248M-MoE",
"harness_winogrande_5",
split="train")
```
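The aggregated metrics can be retrieved the same way through the "results" configuration (a sketch following the pattern above; the config and split names come from the YAML header):
```python
from datasets import load_dataset

# Aggregated metrics of the latest evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Isotonic__TinyMixtral-4x248M-MoE",
    "results",
    split="latest",
)
```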
## Latest results
These are the [latest results from run 2024-04-08T22:20:25.743847](https://huggingface.co/datasets/open-llm-leaderboard/details_Isotonic__TinyMixtral-4x248M-MoE/blob/main/results_2024-04-08T22-20-25.743847.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25062534015174737,
"acc_stderr": 0.030703691746189168,
"acc_norm": 0.25194345598726114,
"acc_norm_stderr": 0.03152544108870022,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871093,
"mc2": 0.483038971320109,
"mc2_stderr": 0.01652857847450844
},
"harness|arc:challenge|25": {
"acc": 0.21075085324232082,
"acc_stderr": 0.01191827175485218,
"acc_norm": 0.27474402730375425,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.2548297151961761,
"acc_stderr": 0.004348748730529936,
"acc_norm": 0.25433180641306513,
"acc_norm_stderr": 0.004345949382382375
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.02937917046412482,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.02937917046412482
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.021411684393694196,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.021411684393694196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604671,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604671
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18064516129032257,
"acc_stderr": 0.021886178567172548,
"acc_norm": 0.18064516129032257,
"acc_norm_stderr": 0.021886178567172548
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.031447125816782405,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.031447125816782405
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25757575757575757,
"acc_stderr": 0.03115626951964684,
"acc_norm": 0.25757575757575757,
"acc_norm_stderr": 0.03115626951964684
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.18134715025906736,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.18134715025906736,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.02037766097037138,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.02037766097037138
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473836,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473836
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.20733944954128442,
"acc_stderr": 0.01738141556360866,
"acc_norm": 0.20733944954128442,
"acc_norm_stderr": 0.01738141556360866
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.025416428388767485,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.025416428388767485
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3542600896860987,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.3542600896860987,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052191,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052191
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615767,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615767
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2822477650063857,
"acc_stderr": 0.016095302969878555,
"acc_norm": 0.2822477650063857,
"acc_norm_stderr": 0.016095302969878555
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21895424836601307,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.21895424836601307,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21543408360128619,
"acc_stderr": 0.02335022547547143,
"acc_norm": 0.21543408360128619,
"acc_norm_stderr": 0.02335022547547143
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.025407197798890155,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.025407197798890155
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590638,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590638
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24641460234680573,
"acc_stderr": 0.011005971399927235,
"acc_norm": 0.24641460234680573,
"acc_norm_stderr": 0.011005971399927235
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4522058823529412,
"acc_stderr": 0.030233758551596452,
"acc_norm": 0.4522058823529412,
"acc_norm_stderr": 0.030233758551596452
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.017704531653250068,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.017704531653250068
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.03384429155233137,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.03384429155233137
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871093,
"mc2": 0.483038971320109,
"mc2_stderr": 0.01652857847450844
},
"harness|winogrande|5": {
"acc": 0.48697711128650356,
"acc_stderr": 0.014047718393997674
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hhhwmws/jiumozhi | ---
license: cc-by-4.0
task_categories:
- text-generation
language:
- zh
size_categories:
- 1K<n<10K
---
Jiumozhi (鸠摩智) data for ChatHaruhi2. It can be loaded as follows:
```python
from chatharuhi import ChatHaruhi
chatbot = ChatHaruhi( role_from_hf = 'hhhwmws/jiumozhi', \
llm = 'openai')
response = chatbot.chat(role='萧峰', text = '是我!')  # chat with the Jiumozhi bot as the character Xiao Feng (萧峰), saying "It's me!"
print(response)
```
Uploader: 米唯实
For more details, see [ChatHaruhi](https://github.com/LC1332/Chat-Haruhi-Suzumiya)
You are welcome to join our [crowdsourced character creation project](https://github.com/LC1332/Chat-Haruhi-Suzumiya/tree/main/characters/novel_collecting)
### Citation
Please cite this repo if you use its data or code.
```
@misc{li2023chatharuhi,
title={ChatHaruhi: Reviving Anime Character in Reality via Large Language Model},
author={Cheng Li and Ziang Leng and Chenxi Yan and Junyi Shen and Hao Wang and Weishi MI and Yaying Fei and Xiaoyang Feng and Song Yan and HaoSheng Wang and Linkang Zhan and Yaokai Jia and Pingyu Wu and Haozhen Sun},
year={2023},
eprint={2308.09597},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
joey234/mmlu-human_sexuality-original-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 7357
num_examples: 13
download_size: 10425
dataset_size: 7357
---
# Dataset Card for "mmlu-human_sexuality-original-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_rare_v5_full_recite_full_passage_random_permute_rerun_8 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 11077443.860400446
num_examples: 6305
- name: validation
num_bytes: 582950
num_examples: 300
download_size: 1813096
dataset_size: 11660393.860400446
---
# Dataset Card for "squad_qa_rare_v5_full_recite_full_passage_random_permute_rerun_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
csad2023/flodata | ---
license: apache-2.0
---
|
swamisharan/deduplicated_dataset | ---
dataset_info:
features:
- name: condition
dtype: string
- name: instruction
dtype: string
- name: system
dtype: string
- name: response
dtype: string
- name: _task_name
dtype: string
- name: _task_source
dtype: string
splits:
- name: train
num_bytes: 2408609642
num_examples: 1575418
download_size: 901442891
dataset_size: 2408609642
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-launch__gov_report-plain_text-c8c9c8-1465553968 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- launch/gov_report
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-base-16384-booksum-V12
metrics: []
dataset_name: launch/gov_report
dataset_config: plain_text
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-base-16384-booksum-V12
* Dataset: launch/gov_report
* Config: plain_text
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
Danieldlima21/youtopia | ---
license: openrail
---
|
CyberHarem/fubuki_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of fubuki/合歓垣フブキ/吹雪 (Blue Archive)
This is the dataset of fubuki/合歓垣フブキ/吹雪 (Blue Archive), containing 270 images and their tags.
The core tags of this character are `multicolored_hair, long_hair, blue_hair, streaked_hair, antenna_hair, halo, twintails, red_eyes, hair_bow, bow, grey_hair, hat, hair_ornament, white_bow, pink_halo, heart_hair_ornament, blue_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 270 | 388.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fubuki_bluearchive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 270 | 336.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fubuki_bluearchive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 699 | 709.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fubuki_bluearchive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fubuki_bluearchive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, armband, blue_necktie, blue_vest, blush, doughnut, holding_food, long_sleeves, open_jacket, simple_background, solo, white_background, white_jacket, white_shirt, black_pantyhose, blue_skirt, collared_shirt, eating, looking_at_viewer, smile, heart |
| 1 | 10 |  |  |  |  |  | 1girl, blue_necktie, blue_vest, doughnut, long_sleeves, looking_at_viewer, sitting, smile, solo, white_jacket, black_pantyhose, holding_food, open_mouth, white_shirt, armband, blue_skirt, collared_shirt, open_clothes, uniform |
| 2 | 10 |  |  |  |  |  | 1girl, detached_collar, looking_at_viewer, playboy_bunny, strapless_leotard, alternate_costume, bare_shoulders, blush, simple_background, solo, white_background, blue_leotard, open_mouth, rabbit_ears, smile, small_breasts, doughnut, fake_animal_ears, open_jacket, white_jacket, black_pantyhose, blue_necktie, covered_navel, heart, holding_food, long_sleeves, hair_between_eyes, off_shoulder |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | armband | blue_necktie | blue_vest | blush | doughnut | holding_food | long_sleeves | open_jacket | simple_background | solo | white_background | white_jacket | white_shirt | black_pantyhose | blue_skirt | collared_shirt | eating | looking_at_viewer | smile | heart | sitting | open_mouth | open_clothes | uniform | detached_collar | playboy_bunny | strapless_leotard | alternate_costume | bare_shoulders | blue_leotard | rabbit_ears | small_breasts | fake_animal_ears | covered_navel | hair_between_eyes | off_shoulder |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:---------------|:------------|:--------|:-----------|:---------------|:---------------|:--------------|:--------------------|:-------|:-------------------|:---------------|:--------------|:------------------|:-------------|:-----------------|:---------|:--------------------|:--------|:--------|:----------|:-------------|:---------------|:----------|:------------------|:----------------|:--------------------|:--------------------|:-----------------|:---------------|:--------------|:----------------|:-------------------|:----------------|:--------------------|:---------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | | X | X | X | | | X | | X | X | X | X | X | | X | X | | X | X | X | X | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | | X | | X | X | X | X | X | X | X | X | X | | X | | | | X | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
zhan1993/transfer_matrix_v3 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: expert_name
dtype: string
- name: task_eval_on
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 5191944
num_examples: 68989
download_size: 1047085
dataset_size: 5191944
---
# Dataset Card for "transfer_matrix_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
iarbel/amazon-product-data-filter | ---
dataset_info:
features:
- name: asin
dtype: string
- name: category
dtype: string
- name: img_url
dtype: string
- name: title
dtype: string
- name: feature-bullets
sequence: string
- name: tech_data
sequence:
sequence: string
- name: labels
dtype: string
- name: tech_process
dtype: string
splits:
- name: train
num_bytes: 2686223
num_examples: 716
- name: validation
num_bytes: 763820
num_examples: 204
- name: test
num_bytes: 390684
num_examples: 103
download_size: 2162385
dataset_size: 3840727
license: cc-by-nc-4.0
task_categories:
- text-generation
language:
- en
size_categories:
- 1K<n<10K
---
# Dataset Card for "amazon-product-data-filter"
## Dataset Description
- **Homepage:** [τenai.io - AI Consulting](https://www.tenai.io/)
- **Point of Contact:** [Iftach Arbel](mailto:ia@momentum-ai.io)
### Dataset Summary
The Amazon Product Dataset contains product listing data from the Amazon US website. It can be used for various NLP and classification tasks, such as text generation, product type classification, attribute extraction, image recognition and more.
### Languages
The text in the dataset is in English.
## Dataset Structure
### Data Instances
Each data point provides product information, such as ASIN (Amazon Standard Identification Number), title, feature-bullets, and more.
### Data Fields
- `asin`: Amazon Standard Identification Number.
- `category`: The product category. This field represents the search string used to obtain the listing; it is not the product category as it appears on Amazon.com.
- `img_url`: Main image URL from the product page.
- `title`: Product title, as appears on the product page.
- `feature-bullets`: Product feature-bullets list, as they appear on the product page.
- `tech_data`: Product technical data (material, style, etc.), as they appear on the product page. Structured as a list of tuples, where the first element is a feature (e.g. material) and the second element is a value (e.g. plastic).
- `labels`: A processed instance of the `feature-bullets` field. The original feature-bullets were normalized to a standard structure (capitalized prefix, emojis removed, etc.). Finally, the list items were concatenated into a single string with a `\n` separator.
- `tech_process`: A processed instance of the `tech_data` field. The original tech data was filtered and transformed from a `(key, value)` structure to natural language text.
### Data Splits
The dataset was randomly split into train (70%), validation (20%), and test (10%). Since the main usage is text generation, the train split is intended for fine-tuning or as a few-shot context. The validation split can be used for tracking perplexity during fine-tuning. The test split should be used to generate text and inspect the quality of the results.
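A minimal loading sketch with the `datasets` library (assuming the dataset loads with its default configuration from the Hub; split sizes are taken from the metadata above):
```python
from datasets import load_dataset

# load all three splits of the product-listing dataset
dataset = load_dataset("iarbel/amazon-product-data-filter")
train, validation, test = dataset["train"], dataset["validation"], dataset["test"]
print(len(train), len(validation), len(test))  # roughly 716 / 204 / 103 examples

# inspect the fields of a single listing
example = train[0]
print(example["asin"], example["category"])
print(example["title"])
print(example["labels"])        # processed feature-bullets, '\n'-separated
print(example["tech_process"])  # processed technical data as natural-language text
```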
## Dataset Creation
### Curation Rationale
This dataset was built to provide high-quality data in the e-commerce domain and to support fine-tuning LLMs for specific tasks. Raw, unstructured data was collected from Amazon.com, then parsed, processed, and filtered using various techniques (annotations, rule-based methods, models).
### Source Data
#### Initial Data Collection and Normalization
The data was obtained by collecting raw HTML data from Amazon.com.
### Annotations
The dataset does not contain any additional annotations.
### Personal and Sensitive Information
There is no personal information in the dataset.
## Considerations for Using the Data
### Social Impact of Dataset
To the best of our knowledge, there is no social impact for this dataset. The data is highly technical, and usage for product text-generation or classification does not pose a risk.
### Other Known Limitations
The quality of product listings may vary, and listings may not be accurate.
## Additional Information
### Dataset Curators
The dataset was collected and curated by [Iftach Arbel](mailto:ia@momentum-ai.io).
### Licensing Information
The dataset is available under the [Creative Commons NonCommercial (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/legalcode).
### Citation Information
```
@misc{amazon_product_filter,
author = {Iftach Arbel},
title = {Amazon Product Dataset Filtered},
year = {2023},
publisher = {Huggingface},
journal = {Huggingface dataset},
howpublished = {\url{https://huggingface.co/datasets/iarbel/amazon-product-data-filter}},
}
``` |
STEM-AI-mtl/Electrical-engineering | ---
license: other
license_name: stem.ai.mtl
license_link: LICENSE
task_categories:
- question-answering
- text-generation
language:
- en
tags:
- Python
- Kicad
- Electrical engineering
size_categories:
- 1K<n<10K
---
## To the electrical engineering community
This dataset contains Q&A prompts about electrical engineering, Kicad's EDA software features and scripting console Python codes.
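A minimal loading sketch with the `datasets` library; the `train` split name is an assumption, and since the card does not document the schema, the column names are printed rather than assumed:
```python
from datasets import load_dataset

# load the electrical-engineering Q&A prompts (the "train" split name is an assumption)
ds = load_dataset("STEM-AI-mtl/Electrical-engineering", split="train")

# the card does not document the schema, so inspect it instead of assuming field names
print(ds.column_names)
print(ds[0])
```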
## Authors
STEM.AI: stem.ai.mtl@gmail.com\
[William Harbec](https://www.linkedin.com/in/william-harbec-56a262248/) |
distilled-from-one-sec-cv12/chunk_32 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 747696476
num_examples: 145693
download_size: 764524372
dataset_size: 747696476
---
# Dataset Card for "chunk_32"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
google/trueteacher | ---
license: cc-by-nc-4.0
language:
- en
tags:
- natural-language-inference
- news-articles-summarization
---
# **TrueTeacher**
## Dataset Summary
This is a large-scale synthetic dataset for training **Factual Consistency Evaluation** models, introduced in the [TrueTeacher paper (Gekhman et al, 2023)](https://aclanthology.org/2023.emnlp-main.127.pdf).
## Dataset Details
The dataset contains model-generated summaries of articles from the train split of the **CNN/DailyMail** dataset [(Hermann et al., 2015)](https://proceedings.neurips.cc/paper_files/paper/2015/file/afdec7005cc9f14302cd0474fd0f3c96-Paper.pdf)
which are annotated for factual consistency using **FLAN-PaLM 540B** [(Chung et al.,2022)](https://arxiv.org/pdf/2210.11416.pdf).
Summaries were generated using summarization models with different capacities, which were created by fine-tuning **T5** [(Raffel et al., 2020)](https://jmlr.org/papers/volume21/20-074/20-074.pdf) on the **XSum** dataset [(Narayan et al., 2018)](https://aclanthology.org/D18-1206.pdf).
We used the following 5 capacities: T5-11B, T5-3B, T5-large, T5-base and T5-small.
## Data format
The data contains json lines with the following keys:
- `"summarization_model"` - The summarization model used to generate the summary.
- `"cnndm_id"` - The original id from the CNN/DailyMail dataset, this need to be used in order to retrieve the corresponding article from CNN/DailyMail (which was used as the grounding document).
- `"summary"` - The model-generated summary.
- `"label"` - A binary label ('1' - Factualy Consistent, '0' - Factualy Inconsistent).
Here is an example of a single data item:
```json
{
"summarization_model": "T5-11B",
"cnndm_id": "f72048a23154de8699c307e2f41157abbfcae261",
"summary": "Children's brains are being damaged by prolonged internet access, a former children's television presenter has warned."
"label": "1",
}
```
## Loading the dataset
To use the dataset, you need to fetch the relevant documents from the CNN/DailyMail dataset. The following code can be used for that purpose:
```python
from datasets import load_dataset
from tqdm import tqdm
trueteacher_data = load_dataset("google/trueteacher", split='train')
cnn_dailymail_data = load_dataset("cnn_dailymail", "3.0.0", split='train')
cnn_dailymail_articles_by_id = {example['id']: example['article'] for example in cnn_dailymail_data}
trueteacher_data_with_documents = []
for example in tqdm(trueteacher_data):
example['document'] = cnn_dailymail_articles_by_id[example['cnndm_id']]
trueteacher_data_with_documents.append(example)
```
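As a small follow-up (not part of the original card), the examples built above can then be partitioned by the consistency label described in the data format section, assuming the label is stored as the string `'1'`/`'0'` as in the example item:
```python
# partition the examples using the binary consistency label ('1' = consistent, '0' = inconsistent)
consistent = [ex for ex in trueteacher_data_with_documents if ex['label'] == '1']
inconsistent = [ex for ex in trueteacher_data_with_documents if ex['label'] == '0']
print(f"{len(consistent)} consistent / {len(inconsistent)} inconsistent summaries")
```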
## Intended Use
This dataset is intended for research use (**non-commercial**) in English.
The recommended use case is training factual consistency evaluation models for summarization.
## Out-of-scope use
Any use cases which violate the **cc-by-nc-4.0** license.
Usage in languages other than English.
## Citation
If you use this dataset for a research publication, please cite the TrueTeacher paper (using the bibtex entry below), as well as the CNN/DailyMail, XSum, T5 and FLAN papers mentioned above.
```
@misc{gekhman2023trueteacher,
title={TrueTeacher: Learning Factual Consistency Evaluation with Large Language Models},
author={Zorik Gekhman and Jonathan Herzig and Roee Aharoni and Chen Elkind and Idan Szpektor},
year={2023},
eprint={2305.11171},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
fathyshalab/MDCSI_oeffentlicher-verkehr-vermietung | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: label_name
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 183476
num_examples: 337
- name: test
num_bytes: 46611
num_examples: 85
download_size: 132553
dataset_size: 230087
---
# Dataset Card for "reklamation24_oeffentlicher-verkehr-vermietung-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/memphis_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of memphis/メンフィス/孟菲斯 (Azur Lane)
This is the dataset of memphis/メンフィス/孟菲斯 (Azur Lane), containing 42 images and their tags.
The core tags of this character are `breasts, green_eyes, long_hair, pink_hair, large_breasts, bangs, hair_ornament, hairclip`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 42 | 56.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/memphis_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 42 | 35.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/memphis_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 93 | 66.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/memphis_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 42 | 52.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/memphis_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 93 | 92.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/memphis_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/memphis_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, solo, cleavage, looking_at_viewer, bare_shoulders, black_leotard, black_pantyhose, blush, white_background, detached_sleeves, earrings, arm_strap, arm_up, between_breasts, broom_riding, ghost, halloween, hand_on_headwear, high_heel_boots, official_alternate_costume, purple_footwear, purple_headwear, revealing_clothes, sidesaddle, simple_background, skindentation, tattoo, thigh_strap, turret, wide_sleeves, witch_hat, bat_(animal), breastless_clothes, choker, crescent_moon, detached_collar, full_body, full_moon, hand_up, long_sleeves, parted_lips, sparkle |
| 1 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_shirt, blush, black_choker, black_thighhighs, white_background, collared_shirt, school_uniform, bare_shoulders, black_jacket, blue_necktie, blue_skirt, holding, long_sleeves, necklace, off_shoulder, simple_background, sitting, sweater_vest, black_footwear, collarbone, medium_breasts, open_jacket, pink_ribbon, pleated_skirt, sailor_collar, shoes, sleeveless_shirt, smile, zettai_ryouiki |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | looking_at_viewer | bare_shoulders | black_leotard | black_pantyhose | blush | white_background | detached_sleeves | earrings | arm_strap | arm_up | between_breasts | broom_riding | ghost | halloween | hand_on_headwear | high_heel_boots | official_alternate_costume | purple_footwear | purple_headwear | revealing_clothes | sidesaddle | simple_background | skindentation | tattoo | thigh_strap | turret | wide_sleeves | witch_hat | bat_(animal) | breastless_clothes | choker | crescent_moon | detached_collar | full_body | full_moon | hand_up | long_sleeves | parted_lips | sparkle | white_shirt | black_choker | black_thighhighs | collared_shirt | school_uniform | black_jacket | blue_necktie | blue_skirt | holding | necklace | off_shoulder | sitting | sweater_vest | black_footwear | collarbone | medium_breasts | open_jacket | pink_ribbon | pleated_skirt | sailor_collar | shoes | sleeveless_shirt | smile | zettai_ryouiki |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------------------|:-----------------|:----------------|:------------------|:--------|:-------------------|:-------------------|:-----------|:------------|:---------|:------------------|:---------------|:--------|:------------|:-------------------|:------------------|:-----------------------------|:------------------|:------------------|:--------------------|:-------------|:--------------------|:----------------|:---------|:--------------|:---------|:---------------|:------------|:---------------|:---------------------|:---------|:----------------|:------------------|:------------|:------------|:----------|:---------------|:--------------|:----------|:--------------|:---------------|:-------------------|:-----------------|:-----------------|:---------------|:---------------|:-------------|:----------|:-----------|:---------------|:----------|:---------------|:-----------------|:-------------|:-----------------|:--------------|:--------------|:----------------|:----------------|:--------|:-------------------|:--------|:-----------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | | X | X | | | X | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
TempoFunk/big | ---
license: agpl-3.0
---
|
bzantium/LongPerplexity | ---
license: apache-2.0
configs:
- config_name: c4
data_files:
- split: test
path: c4.jsonl
- config_name: arxiv
data_files:
- split: test
path: arxiv.jsonl
- config_name: github
data_files:
- split: test
path: github.jsonl
language:
- en
tags:
- ppl
--- |
War455da/Testone | ---
license: mit
task_categories:
- text-classification
- question-answering
- table-question-answering
- conversational
- feature-extraction
- text2text-generation
language:
- en
tags:
- code
--- |
liuyanchen1015/MULTI_VALUE_stsb_drop_aux_be_progressive | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 48312
num_examples: 424
- name: test
num_bytes: 46099
num_examples: 433
- name: train
num_bytes: 137239
num_examples: 1298
download_size: 123922
dataset_size: 231650
---
# Dataset Card for "MULTI_VALUE_stsb_drop_aux_be_progressive"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/VALUE_mrpc_negative_inversion | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: train
num_bytes: 1353
num_examples: 5
download_size: 4790
dataset_size: 1353
---
# Dataset Card for "VALUE_mrpc_negative_inversion"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
skrishna/ruin_names_preprocessed | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
sequence: string
- name: multiple_choice_targets
sequence: string
- name: multiple_choice_scores
sequence: int32
- name: idx
dtype: int32
splits:
- name: train
num_bytes: 111297
num_examples: 359
- name: validation
num_bytes: 27924
num_examples: 89
download_size: 55059
dataset_size: 139221
---
# Dataset Card for "ruin_names_preprocessed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_BFauber__opt125m_10e5_10ep | ---
pretty_name: Evaluation run of BFauber/opt125m_10e5_10ep
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/opt125m_10e5_10ep](https://huggingface.co/BFauber/opt125m_10e5_10ep)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e5_10ep\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T19:26:25.551220](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_10ep/blob/main/results_2024-02-02T19-26-25.551220.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24846725146327228,\n\
\ \"acc_stderr\": 0.030376252466474542,\n \"acc_norm\": 0.24882164259207676,\n\
\ \"acc_norm_stderr\": 0.031176472316746258,\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826837,\n \"mc2\": 0.4621982015682342,\n\
\ \"mc2_stderr\": 0.01527762977208882\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.21928327645051193,\n \"acc_stderr\": 0.01209124578761574,\n\
\ \"acc_norm\": 0.23976109215017063,\n \"acc_norm_stderr\": 0.012476304127453958\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2863971320454093,\n\
\ \"acc_stderr\": 0.004511533039406228,\n \"acc_norm\": 0.3123879705238,\n\
\ \"acc_norm_stderr\": 0.004625198756710251\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066652,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066652\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.17,\n\
\ \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \
\ \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102967,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102967\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.12698412698412698,\n\
\ \"acc_stderr\": 0.029780417522688424,\n \"acc_norm\": 0.12698412698412698,\n\
\ \"acc_norm_stderr\": 0.029780417522688424\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3096774193548387,\n\
\ \"acc_stderr\": 0.026302774983517418,\n \"acc_norm\": 0.3096774193548387,\n\
\ \"acc_norm_stderr\": 0.026302774983517418\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.15763546798029557,\n \"acc_stderr\": 0.025639014131172404,\n\
\ \"acc_norm\": 0.15763546798029557,\n \"acc_norm_stderr\": 0.025639014131172404\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20707070707070707,\n \"acc_stderr\": 0.028869778460267052,\n \"\
acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.028869778460267052\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24870466321243523,\n \"acc_stderr\": 0.031195840877700293,\n\
\ \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.031195840877700293\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2358974358974359,\n \"acc_stderr\": 0.021525965407408726,\n\
\ \"acc_norm\": 0.2358974358974359,\n \"acc_norm_stderr\": 0.021525965407408726\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671549,\n\
\ \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671549\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473835,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473835\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24587155963302754,\n \"acc_stderr\": 0.01846194096870845,\n \"\
acc_norm\": 0.24587155963302754,\n \"acc_norm_stderr\": 0.01846194096870845\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.36771300448430494,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.36771300448430494,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18803418803418803,\n\
\ \"acc_stderr\": 0.02559819368665226,\n \"acc_norm\": 0.18803418803418803,\n\
\ \"acc_norm_stderr\": 0.02559819368665226\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26053639846743293,\n\
\ \"acc_stderr\": 0.015696008563807092,\n \"acc_norm\": 0.26053639846743293,\n\
\ \"acc_norm_stderr\": 0.015696008563807092\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023805186524888156,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023805186524888156\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19935691318327975,\n\
\ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.19935691318327975,\n\
\ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.02324620264781975,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.02324620264781975\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880596,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880596\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.21224489795918366,\n \"acc_stderr\": 0.026176967197866767,\n\
\ \"acc_norm\": 0.21224489795918366,\n \"acc_norm_stderr\": 0.026176967197866767\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n\
\ \"acc_stderr\": 0.034843315926805875,\n \"acc_norm\": 0.27710843373493976,\n\
\ \"acc_norm_stderr\": 0.034843315926805875\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.25146198830409355,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.25146198830409355,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.015225899340826837,\n \"mc2\": 0.4621982015682342,\n\
\ \"mc2_stderr\": 0.01527762977208882\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5224940805051302,\n \"acc_stderr\": 0.014038257824059883\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/opt125m_10e5_10ep
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|arc:challenge|25_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|gsm8k|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hellaswag|10_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-26-25.551220.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T19-26-25.551220.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- '**/details_harness|winogrande|5_2024-02-02T19-26-25.551220.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T19-26-25.551220.parquet'
- config_name: results
data_files:
- split: 2024_02_02T19_26_25.551220
path:
- results_2024-02-02T19-26-25.551220.parquet
- split: latest
path:
- results_2024-02-02T19-26-25.551220.parquet
---
# Dataset Card for Evaluation run of BFauber/opt125m_10e5_10ep
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5_10ep](https://huggingface.co/BFauber/opt125m_10e5_10ep) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_10ep",
	"harness_winogrande_5",
	split="latest")
```
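The same pattern works for any configuration listed in the metadata above. As a short sketch (the config and split names below are taken directly from this card), you can also pull the aggregated scores or a specific MMLU subset:
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_BFauber__opt125m_10e5_10ep"

# Aggregated metrics for this run (the same numbers shown on the leaderboard).
results = load_dataset(repo, "results", split="latest")

# Per-task details, e.g. the 5-shot MMLU world_religions subset.
details = load_dataset(repo, "harness_hendrycksTest_world_religions_5", split="latest")

print(results[0])
print(details[0].keys())
```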
## Latest results
These are the [latest results from run 2024-02-02T19:26:25.551220](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_10ep/blob/main/results_2024-02-02T19-26-25.551220.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.24846725146327228,
"acc_stderr": 0.030376252466474542,
"acc_norm": 0.24882164259207676,
"acc_norm_stderr": 0.031176472316746258,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826837,
"mc2": 0.4621982015682342,
"mc2_stderr": 0.01527762977208882
},
"harness|arc:challenge|25": {
"acc": 0.21928327645051193,
"acc_stderr": 0.01209124578761574,
"acc_norm": 0.23976109215017063,
"acc_norm_stderr": 0.012476304127453958
},
"harness|hellaswag|10": {
"acc": 0.2863971320454093,
"acc_stderr": 0.004511533039406228,
"acc_norm": 0.3123879705238,
"acc_norm_stderr": 0.004625198756710251
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066652,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066652
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102967,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102967
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.12698412698412698,
"acc_stderr": 0.029780417522688424,
"acc_norm": 0.12698412698412698,
"acc_norm_stderr": 0.029780417522688424
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3096774193548387,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.3096774193548387,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15763546798029557,
"acc_stderr": 0.025639014131172404,
"acc_norm": 0.15763546798029557,
"acc_norm_stderr": 0.025639014131172404
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.028869778460267052,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.028869778460267052
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24870466321243523,
"acc_stderr": 0.031195840877700293,
"acc_norm": 0.24870466321243523,
"acc_norm_stderr": 0.031195840877700293
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2358974358974359,
"acc_stderr": 0.021525965407408726,
"acc_norm": 0.2358974358974359,
"acc_norm_stderr": 0.021525965407408726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02665353159671549,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02665353159671549
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473835,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473835
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24587155963302754,
"acc_stderr": 0.01846194096870845,
"acc_norm": 0.24587155963302754,
"acc_norm_stderr": 0.01846194096870845
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.36771300448430494,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.36771300448430494,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18803418803418803,
"acc_stderr": 0.02559819368665226,
"acc_norm": 0.18803418803418803,
"acc_norm_stderr": 0.02559819368665226
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26053639846743293,
"acc_stderr": 0.015696008563807092,
"acc_norm": 0.26053639846743293,
"acc_norm_stderr": 0.015696008563807092
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023805186524888156,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023805186524888156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19935691318327975,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.19935691318327975,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880596,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880596
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.21224489795918366,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.21224489795918366,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.034843315926805875,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.034843315926805875
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.25146198830409355,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.25146198830409355,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826837,
"mc2": 0.4621982015682342,
"mc2_stderr": 0.01527762977208882
},
"harness|winogrande|5": {
"acc": 0.5224940805051302,
"acc_stderr": 0.014038257824059883
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
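If you prefer working with the raw results file rather than the `results` configuration, the sketch below downloads the JSON linked above and averages the MMLU (hendrycksTest) scores. The exact file layout is an assumption, so the code tolerates the scores sitting either at the top level or under a `"results"` key.
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced above (filename taken from this card).
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_BFauber__opt125m_10e5_10ep",
    filename="results_2024-02-02T19-26-25.551220.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Per-task scores may sit under a top-level "results" key depending on the layout.
scores = data.get("results", data)

# Macro-average normalized accuracy over the MMLU (hendrycksTest) subsets.
mmlu = [v["acc_norm"] for k, v in scores.items() if "hendrycksTest" in k]
print(f"MMLU acc_norm, averaged over {len(mmlu)} subsets: {sum(mmlu) / len(mmlu):.4f}")
```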
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
infCapital/WizardLM_Orca_vi | ---
license: mit
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 140945974
num_examples: 52507
download_size: 58938956
dataset_size: 140945974
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Explain-tuned WizardLM dataset (~55K examples), created using approaches from the Orca Research Paper.
We leverage all 15 system instructions provided in the Orca Research Paper to generate custom datasets, in contrast to the vanilla instruction-tuning approaches used by the original datasets.
This helps student models like orca_mini_13b learn the thought process of the teacher model, which is ChatGPT (gpt-3.5-turbo version). A minimal loading and prompt-formatting sketch follows.
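The prompt template below, and the assumption that `instruction` holds the system-style guidance while `input` holds the user query, are illustrative only and are not documented by the dataset itself.
```python
from datasets import load_dataset

ds = load_dataset("infCapital/WizardLM_Orca_vi", split="train")
ex = ds[0]

# Assumption: `instruction` carries the Orca-style system instruction and
# `input` carries the user query; inspect a few rows before relying on this.
prompt = (
    f"### System:\n{ex['instruction']}\n\n"
    f"### User:\n{ex['input']}\n\n"
    f"### Response:\n"
)
target = ex["output"]
print(prompt + target[:200])
```
Check which column plays which role on a handful of rows before committing to a template. |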
Sleoruiz/disc_cla_cuarta-2 | ---
dataset_info:
features:
- name: text
dtype: 'null'
- name: inputs
struct:
- name: comision
dtype: string
- name: fecha_gaceta
dtype: string
- name: gaceta_numero
dtype: string
- name: name
dtype: string
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
sequence: string
- name: annotation_agent
dtype: string
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 8052385
num_examples: 3349
download_size: 4065041
dataset_size: 8052385
---
# Dataset Card for "disc_cla_-cuarta-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_KoboldAI__OPT-13B-Erebus | ---
pretty_name: Evaluation run of KoboldAI/OPT-13B-Erebus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KoboldAI/OPT-13B-Erebus](https://huggingface.co/KoboldAI/OPT-13B-Erebus) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__OPT-13B-Erebus\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T19:32:11.305673](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__OPT-13B-Erebus/blob/main/results_2023-09-22T19-32-11.305673.json)(note\
  \ that there might be results for other tasks in the repo if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n\
\ \"em_stderr\": 0.00027736144573356107,\n \"f1\": 0.05225776006711417,\n\
\ \"f1_stderr\": 0.001244995094102496,\n \"acc\": 0.33646636224974913,\n\
\ \"acc_stderr\": 0.007825552570817792\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.00027736144573356107,\n\
\ \"f1\": 0.05225776006711417,\n \"f1_stderr\": 0.001244995094102496\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n \
\ \"acc_stderr\": 0.0023892815120772175\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.665351223362273,\n \"acc_stderr\": 0.013261823629558368\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KoboldAI/OPT-13B-Erebus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T19_32_11.305673
path:
- '**/details_harness|drop|3_2023-09-22T19-32-11.305673.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T19-32-11.305673.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T19_32_11.305673
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-32-11.305673.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-32-11.305673.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:18:46.837610.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:18:46.837610.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:18:46.837610.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T19_32_11.305673
path:
- '**/details_harness|winogrande|5_2023-09-22T19-32-11.305673.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T19-32-11.305673.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_18_46.837610
path:
- results_2023-07-19T18:18:46.837610.parquet
- split: 2023_09_22T19_32_11.305673
path:
- results_2023-09-22T19-32-11.305673.parquet
- split: latest
path:
- results_2023-09-22T19-32-11.305673.parquet
---
# Dataset Card for Evaluation run of KoboldAI/OPT-13B-Erebus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KoboldAI/OPT-13B-Erebus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KoboldAI/OPT-13B-Erebus](https://huggingface.co/KoboldAI/OPT-13B-Erebus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KoboldAI__OPT-13B-Erebus",
"harness_winogrande_5",
split="train")
```
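The aggregated metrics live in the "results" configuration. A minimal sketch of loading them (the configuration name and the "latest" split are the ones declared in the metadata above):
```python
from datasets import load_dataset

# aggregated results of the most recent run (see the "results" configuration above)
results = load_dataset("open-llm-leaderboard/details_KoboldAI__OPT-13B-Erebus",
	"results",
	split="latest")
```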
## Latest results
These are the [latest results from run 2023-09-22T19:32:11.305673](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__OPT-13B-Erebus/blob/main/results_2023-09-22T19-32-11.305673.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.00027736144573356107,
"f1": 0.05225776006711417,
"f1_stderr": 0.001244995094102496,
"acc": 0.33646636224974913,
"acc_stderr": 0.007825552570817792
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.00027736144573356107,
"f1": 0.05225776006711417,
"f1_stderr": 0.001244995094102496
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.0023892815120772175
},
"harness|winogrande|5": {
"acc": 0.665351223362273,
"acc_stderr": 0.013261823629558368
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tyzhu/squad_qa_wrong_title_v5_full_recite_ans_sent_random_permute_rerun_1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 4631327.158273381
num_examples: 2875
- name: validation
num_bytes: 422069
num_examples: 300
download_size: 1393400
dataset_size: 5053396.158273381
---
# Dataset Card for "squad_qa_wrong_title_v5_full_recite_ans_sent_random_permute_rerun_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydraLM/partitioned_v2_standardized_12 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
splits:
- name: train
num_bytes: 27471620.419495568
num_examples: 57255
download_size: 22536165
dataset_size: 27471620.419495568
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v2_standardized_12"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/mordred_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mordred/モードレッド/莫德雷德 (Fate/Grand Order)
This is the dataset of mordred/モードレッド/莫德雷德 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `blonde_hair, ponytail, long_hair, green_eyes, scrunchie, braid, red_scrunchie, hair_ornament, hair_scrunchie, breasts, french_braid, small_breasts, parted_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 761.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mordred_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 663.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mordred_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1254 | 1.27 GiB | [Download](https://huggingface.co/datasets/CyberHarem/mordred_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
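For the IMG+TXT packages, a minimal download-and-extract sketch (file name taken from the table above; the local directory name is only an illustration):
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 1200px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/mordred_fgo',
    repo_type='dataset',
    filename='dataset-1200.zip',
)

# extract the images and their .txt tag files to a local directory
extract_dir = 'mordred_fgo_1200'
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)
```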
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mordred_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, gauntlets, looking_at_viewer, pauldrons, solo, smile, breastplate, holding_sword |
| 1 | 7 |  |  |  |  |  | 1girl, gauntlets, holding_sword, open_mouth, solo, breastplate, pauldrons, teeth, upper_body |
| 2 | 7 |  |  |  |  |  | 1girl, gauntlets, holding_sword, looking_at_viewer, solo, shoulder_armor, grin, teeth, breastplate, upper_body, red_background |
| 3 | 7 |  |  |  |  |  | 1girl, detached_sleeves, looking_at_viewer, solo, holding_sword, navel, simple_background, smile, thighhighs, white_background, bare_shoulders, full_body, midriff, underboob, red_footwear, standing |
| 4 | 13 |  |  |  |  |  | 1girl, belt, denim_shorts, midriff, navel, necklace, solo, bandeau, cutoffs, looking_at_viewer, open_jacket, red_jacket, long_sleeves, short_shorts, collarbone, simple_background, stomach, cowboy_shot, white_background, holding_sword, cleavage, grin, sidelocks |
| 5 | 45 |  |  |  |  |  | 1girl, formal, solo, suit, looking_at_viewer, smile, white_shirt, black_jacket, black_necktie, white_gloves, collared_shirt, long_sleeves, pants, vest, flower, simple_background |
| 6 | 19 |  |  |  |  |  | 1girl, solo, red_bikini, side-tie_bikini_bottom, outdoors, navel, string_bikini, day, blue_sky, looking_at_viewer, smile, cloud, halterneck, blush, collarbone, medium_breasts, open_mouth, bare_shoulders, front-tie_bikini_top, holding_surfboard, ocean, beach |
| 7 | 12 |  |  |  |  |  | 1girl, sidelocks, solo, bare_shoulders, looking_at_viewer, thighs, collarbone, navel, red_panties, blush, red_bra, underwear_only |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | gauntlets | looking_at_viewer | pauldrons | solo | smile | breastplate | holding_sword | open_mouth | teeth | upper_body | shoulder_armor | grin | red_background | detached_sleeves | navel | simple_background | thighhighs | white_background | bare_shoulders | full_body | midriff | underboob | red_footwear | standing | belt | denim_shorts | necklace | bandeau | cutoffs | open_jacket | red_jacket | long_sleeves | short_shorts | collarbone | stomach | cowboy_shot | cleavage | sidelocks | formal | suit | white_shirt | black_jacket | black_necktie | white_gloves | collared_shirt | pants | vest | flower | red_bikini | side-tie_bikini_bottom | outdoors | string_bikini | day | blue_sky | cloud | halterneck | blush | medium_breasts | front-tie_bikini_top | holding_surfboard | ocean | beach | thighs | red_panties | red_bra | underwear_only |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:--------------------|:------------|:-------|:--------|:--------------|:----------------|:-------------|:--------|:-------------|:-----------------|:-------|:-----------------|:-------------------|:--------|:--------------------|:-------------|:-------------------|:-----------------|:------------|:----------|:------------|:---------------|:-----------|:-------|:---------------|:-----------|:----------|:----------|:--------------|:-------------|:---------------|:---------------|:-------------|:----------|:--------------|:-----------|:------------|:---------|:-------|:--------------|:---------------|:----------------|:---------------|:-----------------|:--------|:-------|:---------|:-------------|:-------------------------|:-----------|:----------------|:------|:-----------|:--------|:-------------|:--------|:-----------------|:-----------------------|:--------------------|:--------|:--------|:---------|:--------------|:----------|:-----------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | | X | | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | X | | X | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 13 |  |  |  |  |  | X | | X | | X | | | X | | | | | X | | | X | X | | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 45 |  |  |  |  |  | X | | X | | X | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 6 | 19 |  |  |  |  |  | X | | X | | X | X | | | X | | | | | | | X | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 7 | 12 |  |  |  |  |  | X | | X | | X | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X |
|
chiennv/ultrachat-50k | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 477810379
num_examples: 50000
download_size: 0
dataset_size: 477810379
---
# Dataset Card for "ultrachat-50k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Charly9000/forums | ---
license: mit
---
|
autoevaluate/autoeval-eval-multi_news-default-e22c67-2252871794 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- multi_news
eval_info:
task: summarization
model: pszemraj/led-large-book-summary
metrics: []
dataset_name: multi_news
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/led-large-book-summary
* Dataset: multi_news
* Config: default
* Split: test
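A minimal sketch of pulling these predictions locally (the repository id is that of this dataset; split and column names are not listed on this card, so none are assumed):
```python
from datasets import load_dataset

# load every available split of the prediction repository
predictions = load_dataset("autoevaluate/autoeval-eval-multi_news-default-e22c67-2252871794")
print(predictions)
```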
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
davidiftime/instructify | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 73895710
num_examples: 142622
download_size: 38839398
dataset_size: 73895710
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mytoon/niji-0802 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1966037219.914
num_examples: 1038
download_size: 1967850762
dataset_size: 1966037219.914
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "niji-0802"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
denocris/guanaco-openassistant-llama2-2k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3045282
num_examples: 2000
download_size: 1809448
dataset_size: 3045282
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mstz/ozone | ---
language:
- en
tags:
- ozone
- tabular_classification
- binary_classification
pretty_name: Ozone
size_categories:
- 1K<n<10K
task_categories:
- tabular-classification
configs:
- 8hr
- 1hr
license: cc
---
# Ozone
The [Ozone dataset](https://archive.ics.uci.edu/ml/datasets/Ozone) from the [UCI ML repository](https://archive.ics.uci.edu/ml/datasets).
# Configurations and tasks
| **Configuration** | **Task** | **Description** |
|-------------------|---------------------------|-------------------------|
| 8hr               | Binary classification    | Is this an ozone day (8-hour peak)? |
| 1hr               | Binary classification    | Is this an ozone day (1-hour peak)? |
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/ozone", "8hr")["train"]
``` |
ibranze/araproje_truthful_tr | ---
dataset_info:
features:
- name: question
dtype: string
- name: mc1_targets
struct:
- name: choices
sequence: string
- name: labels
sequence: int32
- name: mc2_targets
struct:
- name: choices
sequence: string
- name: labels
sequence: int32
splits:
- name: validation
num_bytes: 204710
num_examples: 250
download_size: 97922
dataset_size: 204710
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_truthful_tr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tensoic/airoboros-3.2_kn | ---
license: cc-by-4.0
task_categories:
- text-generation
language:
- kn
---
Kannada translation of jondurbin/airoboros-3.2 |
ideepankarsharma2003/Midjourney_v6_Classification_small_shuffled | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': ai_gen
'1': human
splits:
- name: train
num_bytes: 88207741301.0
num_examples: 36000
- name: validation
num_bytes: 1058780003.0
num_examples: 464
- name: test
num_bytes: 4591912204.0
num_examples: 2000
download_size: 85325608160
dataset_size: 93858433508.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Simonlob/Kany_dataset_mk4 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: raw_transcription
dtype: string
- name: transcription
dtype: string
- name: sentence_type
dtype: string
- name: speaker_id
dtype: string
- name: gender
dtype: int64
- name: audio
struct:
- name: array
sequence: float32
- name: path
dtype: string
- name: sampling_rate
dtype: int64
splits:
- name: train
num_bytes: 8364217743.56407
num_examples: 7016
- name: test
num_bytes: 36957062.435930185
num_examples: 31
download_size: 3712380863
dataset_size: 8401174806.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_mnli_were_was | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 144918
num_examples: 642
- name: dev_mismatched
num_bytes: 199131
num_examples: 873
- name: test_matched
num_bytes: 142000
num_examples: 625
- name: test_mismatched
num_bytes: 193484
num_examples: 843
- name: train
num_bytes: 5853995
num_examples: 24981
download_size: 4012329
dataset_size: 6533528
---
# Dataset Card for "MULTI_VALUE_mnli_were_was"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/multi_language_conversation | ---
task_categories:
- conversational
---
# Dataset Card for multi_language_conversation
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://nexdata.ai/?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
The dataset contains 12,000 hours of multi-language conversation speech data. It's recorded by native speakers, covering English, French, German, Russian, Spanish, Japanese, Korean, Hindi, Vietnamese etc. The speakers start the conversation around a familiar topic, to ensure the smoothness and naturalness of the conversation. The format is 16kHz, 16bit, uncompressed wav, mono channel. The sentence accuracy is over 95%.
For more details, please refer to the link: https://nexdata.ai/speechRecognition?source=Huggingface
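A minimal sketch (standard library only; the file name is only a placeholder) of checking that a delivered recording matches the stated format (16kHz, 16bit, mono wav):
```python
import wave

# open a delivered recording; the file name here is a placeholder
with wave.open("conversation_sample.wav", "rb") as wav:
    assert wav.getframerate() == 16000   # 16 kHz sampling rate
    assert wav.getsampwidth() == 2       # 16-bit samples (2 bytes per sample)
    assert wav.getnchannels() == 1       # mono channel
    duration_s = wav.getnframes() / wav.getframerate()
    print(f"duration: {duration_s:.1f} s")
```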
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
English, French, German, Russian, Spanish, Japanese, Korean, Hindi, Vietnamese etc.
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License
### Citation Information
[More Information Needed]
### Contributions |
kamel-usp/aes_enem_dataset | ---
license: apache-2.0
task_categories:
- text-classification
language:
- pt
tags:
- education
- aes
- enem
size_categories:
- n<1K
---
# Automated Essay Score (AES) ENEM Dataset
## Dataset Description
- **Purpose**: Automated Essay Scoring
- **Contents**: Student Essay Grades
- **Source**: https://github.com/kamel-usp/aes_enem
- **Size**: N<1000
## Use Case and Creators
- **Intended Use**: Estimate Essay Score
- **Creators**: Igor Cataneo Silveira, André Barbosa and Denis Deratani Mauá
- **Contact Information**: igorcs@ime.usp.br; andre.barbosa@ime.usp.br
## Licensing Information
- **License**: MIT License
## Citation Details
- **Preferred Citation**:
```
@proceedings{DBLP:conf/propor/2024,
editor = {Igor Cataneo Silveira, André Barbosa and Denis Deratani Mauá},
title = {Computational Processing of the Portuguese Language - 16th International
Conference, {PROPOR} 2024, Galiza, March 13-15, 2024, Proceedings},
series = {Lecture Notes in Computer Science},
volume = {TODO},
publisher = {Springer},
year = {2024},
url = {TODO},
doi = {TODO},
isbn = {TODO},
timestamp = {TODO},
biburl = {TODO},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
## Data Structure
- **Features**:
- id: id of scraped page. `id_prompt`+`id` should be unique
- id_prompt: Essay's theme
- essay_title: Essay title
- essay_text: Essay text
- grades: list of 6 elements containing the grade for each of the five concepts plus the sum of all grades
- essay_year: Essay's year
- **Number of Instances**:
- sourceAOnly:
- train: 227
- validation: 68
- test: 90
- sourceAWithGraders:
- train: 744
- validation: 195
- test: 216
- sourceB:
- full: 3219
- **Data Splits**:
- sourceAOnly: sourceA data
  - sourceAWithGraders: sourceA data augmented with the graders' reviews. In a nutshell, each row becomes three rows (the original grade plus the two graders' results)
- sourceB: sourceB data
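A minimal loading sketch (repository id and configuration names as listed above; treating `train` as a split of `sourceAOnly` follows the instance counts given in this card):
```python
from datasets import load_dataset

# one of the three configurations described above
train = load_dataset("kamel-usp/aes_enem_dataset", "sourceAOnly", split="train")

example = train[0]
# "grades" holds the five concept grades plus their sum (6 values in total)
print(example["essay_year"], example["grades"])
```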
## Data Considerations
- **Known Limitations**:
- **Ethical Considerations**:
## Additional Information
- **Additional Links**: Main code is [here](https://github.com/kamel-usp/aes_enem)
- **Related Datasets**: https://github.com/evelinamorim/aes-pt |