| datasetId | card |
|---|---|
youyu0105/llm-MIDI4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 570535
num_examples: 335
download_size: 131987
dataset_size: 570535
---
# Dataset Card for "llm-MIDI4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/gr_mg36_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of gr_mg36/GrMG36/MG36 (Girls' Frontline)
This is the dataset of gr_mg36/GrMG36/MG36 (Girls' Frontline), containing 26 images and their tags.
The core tags of this character are `blue_eyes, blonde_hair, hair_ornament, bangs, long_hair, hair_over_one_eye, hairclip, breasts, ahoge, heterochromia, yellow_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 26 | 32.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg36_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 26 | 16.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg36_girlsfrontline/resolve/main/dataset-800.zip)                | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800 | 51 | 30.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg36_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 26 | 26.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg36_girlsfrontline/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 51 | 45.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg36_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/gr_mg36_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
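The `IMG+TXT` packages above pair each image with a same-named `.txt` tag file. Once an archive is extracted, the pairs can be walked with plain `pathlib`; this is a minimal sketch, assuming `.png` images and comma-separated tags (neither is guaranteed by the archives themselves):

```python
from pathlib import Path


def pair_images_with_tags(dataset_dir):
    """Yield (image_path, tags) pairs from an extracted IMG+TXT archive."""
    for img in sorted(Path(dataset_dir).glob('**/*.png')):
        txt = img.with_suffix('.txt')
        if not txt.exists():
            continue
        # assumption: tag files hold comma-separated booru-style tags
        tags = [t.strip() for t in txt.read_text(encoding='utf-8').split(',')]
        yield img, tags


for img, tags in pair_images_with_tags('dataset_dir'):
    print(img.name, tags)
```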
## List of Clusters
List of tag clustering results; some outfit groupings may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 26 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_gloves, closed_mouth, collarbone, white_background, bare_shoulders, fingerless_gloves, simple_background, socks, weapon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | black_gloves | closed_mouth | collarbone | white_background | bare_shoulders | fingerless_gloves | simple_background | socks | weapon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:---------------|:-------------|:-------------------|:-----------------|:--------------------|:--------------------|:--------|:---------|
| 0 | 26 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X |
|
japanese-asr/whisper_transcriptions.reazonspeech.all_17 | ---
dataset_info:
config_name: all
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 30517145303.0
num_examples: 268223
download_size: 30279684051
dataset_size: 30517145303.0
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
---
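The split metadata above gives a rough sense of how much audio each example carries. A back-of-the-envelope sketch, assuming 16-bit mono PCM at the declared 16 kHz sampling rate (the actual parquet audio encoding may be compressed, in which case this underestimates the duration):

```python
NUM_BYTES = 30_517_145_303  # num_bytes of the train split
NUM_EXAMPLES = 268_223      # num_examples of the train split
SAMPLING_RATE = 16_000      # declared in the audio feature
BYTES_PER_SAMPLE = 2        # assumption: 16-bit mono PCM

bytes_per_example = NUM_BYTES / NUM_EXAMPLES
seconds_per_example = bytes_per_example / (SAMPLING_RATE * BYTES_PER_SAMPLE)
print(f"~{bytes_per_example:,.0f} bytes/example, ~{seconds_per_example:.1f} s of audio")
```

Under these assumptions each example holds roughly 3.5 seconds of speech, which is in the range typical for short ASR clips.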
|
poorguys/TW-Kai_2_aoyagireisyosimo2_all_512 | ---
dataset_info:
features:
- name: char
dtype: string
- name: unicode
dtype: string
- name: images
dtype: image
- name: target_images
dtype: image
- name: stroke
dtype: int32
- name: strokes_sequence
sequence: int32
- name: components
sequence: int32
- name: jyutping
dtype: string
splits:
- name: train
num_bytes: 438261774.25
num_examples: 6351
- name: test
num_bytes: 2409980857.25
num_examples: 69931
download_size: 1959823420
dataset_size: 2848242631.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
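Each row above carries both a `char` and a `unicode` column. A hedged sketch of deriving a codepoint label from a character, assuming the column holds `U+XXXX`-style labels (the dataset's exact string format is unverified):

```python
def codepoint_label(ch):
    """Return a U+XXXX-style codepoint label for a single character."""
    return f"U+{ord(ch):04X}"


print(codepoint_label('中'))  # → U+4E2D
```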
|
Multimodal-Fatima/Caltech101_not_background_test_facebook_opt_1.3b_Visclues_ns_5647 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_16
num_bytes: 84811910.125
num_examples: 5647
- name: fewshot_1_bs_16
num_bytes: 86719029.125
num_examples: 5647
- name: fewshot_3_bs_16
num_bytes: 90542558.125
num_examples: 5647
- name: fewshot_5_bs_16
num_bytes: 94354619.125
num_examples: 5647
- name: fewshot_8_bs_16
num_bytes: 100058064.125
num_examples: 5647
download_size: 418819193
dataset_size: 456486180.625
---
# Dataset Card for "Caltech101_not_background_test_facebook_opt_1.3b_Visclues_ns_5647"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
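The split sizes above grow with the few-shot count, since each extra shot lengthens the prompt. A quick sketch, using only the byte counts from the card, of the marginal prompt bytes each added shot contributes per example:

```python
NUM_EXAMPLES = 5647
split_bytes = {  # num_bytes per fewshot_k_bs_16 split, from the card
    0: 84_811_910.125,
    1: 86_719_029.125,
    3: 90_542_558.125,
    5: 94_354_619.125,
    8: 100_058_064.125,
}

shots = sorted(split_bytes)
for a, b in zip(shots, shots[1:]):
    per_shot = (split_bytes[b] - split_bytes[a]) / ((b - a) * NUM_EXAMPLES)
    print(f"{a}->{b} shots: ~{per_shot:.0f} bytes/example per shot")
```

The increments are nearly constant, suggesting each shot adds a similarly sized exemplar to the prompt.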
open-llm-leaderboard/details_CausalLM__7B | ---
pretty_name: Evaluation run of CausalLM/7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CausalLM/7B](https://huggingface.co/CausalLM/7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CausalLM__7B_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-19T10:15:27.073071](https://huggingface.co/datasets/open-llm-leaderboard/details_CausalLM__7B_public/blob/main/results_2023-11-19T10-15-27.073071.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6094831324044202,\n\
\ \"acc_stderr\": 0.0327856640395233,\n \"acc_norm\": 0.6180866854509012,\n\
\ \"acc_norm_stderr\": 0.03347186592408746,\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5012670346064317,\n\
\ \"mc2_stderr\": 0.015282424019072406,\n \"em\": 0.3381921140939597,\n\
\ \"em_stderr\": 0.0048449283464877275,\n \"f1\": 0.4114880453020153,\n\
\ \"f1_stderr\": 0.00471092648573539\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47013651877133106,\n \"acc_stderr\": 0.014585305840007102,\n\
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.014611390804670088\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5603465445130452,\n\
\ \"acc_stderr\": 0.004953305461311753,\n \"acc_norm\": 0.7457677753435571,\n\
\ \"acc_norm_stderr\": 0.00434538861452003\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7167630057803468,\n\
\ \"acc_stderr\": 0.034355680560478746,\n \"acc_norm\": 0.7167630057803468,\n\
\ \"acc_norm_stderr\": 0.034355680560478746\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.046151869625837026,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.046151869625837026\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382175,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382175\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.027479603010538808,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.027479603010538808\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306443,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.024864995159767755,\n\
\ \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.024864995159767755\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.031811100324139266,\n\
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.031811100324139266\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.01639943636661291,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.01639943636661291\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381387,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381387\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6560693641618497,\n \"acc_stderr\": 0.025574123786546655,\n\
\ \"acc_norm\": 0.6560693641618497,\n \"acc_norm_stderr\": 0.025574123786546655\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.01495010300247536,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.01495010300247536\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906497,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906497\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.02638527370346449,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.02638527370346449\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603746,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4954367666232073,\n\
\ \"acc_stderr\": 0.012769704263117526,\n \"acc_norm\": 0.4954367666232073,\n\
\ \"acc_norm_stderr\": 0.012769704263117526\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n\
\ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6160130718954249,\n \"acc_stderr\": 0.01967580813528152,\n \
\ \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.01967580813528152\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712845,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712845\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482705,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482705\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5012670346064317,\n\
\ \"mc2_stderr\": 0.015282424019072406\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.696921862667719,\n \"acc_stderr\": 0.012916727462634458\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.3381921140939597,\n \
\ \"em_stderr\": 0.0048449283464877275,\n \"f1\": 0.4114880453020153,\n\
\ \"f1_stderr\": 0.00471092648573539\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.22971948445792267,\n \"acc_stderr\": 0.011586857544997503\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CausalLM/7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|arc:challenge|25_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|drop|3_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|gsm8k|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hellaswag|10_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T10-15-27.073071.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-19T10-15-27.073071.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- '**/details_harness|winogrande|5_2023-11-19T10-15-27.073071.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-19T10-15-27.073071.parquet'
- config_name: results
data_files:
- split: 2023_11_19T10_15_27.073071
path:
- results_2023-11-19T10-15-27.073071.parquet
- split: latest
path:
- results_2023-11-19T10-15-27.073071.parquet
---
# Dataset Card for Evaluation run of CausalLM/7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CausalLM/7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CausalLM/7B](https://huggingface.co/CausalLM/7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CausalLM__7B_public",
"harness_winogrande_5",
	split="latest")
```
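Since the splits are named with the run's timestamp, the newest run can also be selected programmatically; a minimal sketch (the `newest_run_split` helper is hypothetical, and the timestamp format is taken from this card's split names):

```python
# Pick the most recent run split from a list of timestamp-named splits.
# Split names follow the zero-padded pattern "YYYY_MM_DDTHH_MM_SS.ffffff",
# as in this card; "latest" is an alias maintained alongside them.
def newest_run_split(split_names):
    """Return the most recent timestamp-named split, ignoring the 'latest' alias."""
    runs = [s for s in split_names if s != "latest"]
    # Lexicographic order matches chronological order for this zero-padded format.
    return max(runs)

splits = ["2023_11_19T10_15_27.073071", "latest"]
print(newest_run_split(splits))  # -> 2023_11_19T10_15_27.073071
```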
## Latest results
These are the [latest results from run 2023-11-19T10:15:27.073071](https://huggingface.co/datasets/open-llm-leaderboard/details_CausalLM__7B_public/blob/main/results_2023-11-19T10-15-27.073071.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6094831324044202,
"acc_stderr": 0.0327856640395233,
"acc_norm": 0.6180866854509012,
"acc_norm_stderr": 0.03347186592408746,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5012670346064317,
"mc2_stderr": 0.015282424019072406,
"em": 0.3381921140939597,
"em_stderr": 0.0048449283464877275,
"f1": 0.4114880453020153,
"f1_stderr": 0.00471092648573539
},
"harness|arc:challenge|25": {
"acc": 0.47013651877133106,
"acc_stderr": 0.014585305840007102,
"acc_norm": 0.5,
"acc_norm_stderr": 0.014611390804670088
},
"harness|hellaswag|10": {
"acc": 0.5603465445130452,
"acc_stderr": 0.004953305461311753,
"acc_norm": 0.7457677753435571,
"acc_norm_stderr": 0.00434538861452003
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.034355680560478746,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.034355680560478746
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382175,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382175
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.027479603010538808,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.027479603010538808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306443,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5974358974358974,
"acc_stderr": 0.024864995159767755,
"acc_norm": 0.5974358974358974,
"acc_norm_stderr": 0.024864995159767755
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.031811100324139266,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.031811100324139266
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.01639943636661291,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.01639943636661291
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381387,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381387
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.025574123786546655,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.025574123786546655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.01495010300247536,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.01495010300247536
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906497,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906497
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.02638527370346449,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.02638527370346449
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603746,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4954367666232073,
"acc_stderr": 0.012769704263117526,
"acc_norm": 0.4954367666232073,
"acc_norm_stderr": 0.012769704263117526
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.01967580813528152,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.01967580813528152
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712845,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712845
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482705,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482705
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5012670346064317,
"mc2_stderr": 0.015282424019072406
},
"harness|winogrande|5": {
"acc": 0.696921862667719,
"acc_stderr": 0.012916727462634458
},
"harness|drop|3": {
"em": 0.3381921140939597,
"em_stderr": 0.0048449283464877275,
"f1": 0.4114880453020153,
"f1_stderr": 0.00471092648573539
},
"harness|gsm8k|5": {
"acc": 0.22971948445792267,
"acc_stderr": 0.011586857544997503
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ewhfef/mix_cpt | ---
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 426850969
num_examples: 29635
download_size: 195835115
dataset_size: 426850969
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
A mix drawn from parts of the pubmed, pubmed_qa, and alpaca datasets. |
Otherwa/GenAi-Public-Response | ---
license: openrail
language:
- en
tags:
- code
- legal
- finance
- biology
- chemistry
- music
- art
- medical
- climate
size_categories:
- n<1K
--- |
tyzhu/squad_qa_no_id_v5_full_recite_ans_sent_no_permute_rerun | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7992127.884496851
num_examples: 4778
- name: validation
num_bytes: 402971
num_examples: 300
download_size: 1428626
dataset_size: 8395098.88449685
---
# Dataset Card for "squad_qa_no_id_v5_full_recite_ans_sent_no_permute_rerun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/car_v1.5_trec-y1_manual | ---
pretty_name: '`car/v1.5/trec-y1/manual`'
viewer: false
source_datasets: ['irds/car_v1.5']
task_categories:
- text-retrieval
---
# Dataset Card for `car/v1.5/trec-y1/manual`
The `car/v1.5/trec-y1/manual` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/car#car/v1.5/trec-y1/manual).
# Data
This dataset provides:
- `qrels`: (relevance assessments); count=29,571
- For `docs`, use [`irds/car_v1.5`](https://huggingface.co/datasets/irds/car_v1.5)
## Usage
```python
from datasets import load_dataset
qrels = load_dataset('irds/car_v1.5_trec-y1_manual', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
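Each record yielded by the qrels loop above is a plain Python dict, so standard tooling applies directly. As a minimal sketch (using made-up placeholder records, not actual TREC CAR judgments), here is one way to tally relevant documents per query:

```python
# Qrels records from this dataset are plain dicts with the fields
# query_id, doc_id, relevance, and iteration. The records below are
# hypothetical placeholders, not real judgments from the dataset.
from collections import Counter

records = [
    {"query_id": "q1", "doc_id": "d1", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "d2", "relevance": 0, "iteration": "0"},
    {"query_id": "q2", "doc_id": "d3", "relevance": 1, "iteration": "0"},
]

# Count how many documents were judged relevant (relevance > 0) per query.
relevant_per_query = Counter(
    r["query_id"] for r in records if r["relevance"] > 0
)
print(relevant_per_query)  # Counter({'q1': 1, 'q2': 1})
```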
## Citation Information
```
@inproceedings{Dietz2017TrecCar,
title={TREC Complex Answer Retrieval Overview.},
author={Dietz, Laura and Verma, Manisha and Radlinski, Filip and Craswell, Nick},
booktitle={TREC},
year={2017}
}
@article{Dietz2017Car,
title={{TREC CAR}: A Data Set for Complex Answer Retrieval},
author={Laura Dietz and Ben Gamari},
year={2017},
note={Version 1.5},
url={http://trec-car.cs.unh.edu}
}
```
|
vigneshgs7/Boundary_detection_Doc_2 | ---
dataset_info:
features:
- name: name
dtype: string
- name: uuid
dtype: string
- name: status
dtype: string
- name: image
dtype: image
- name: label.annotations
list:
- name: id
dtype: int32
- name: category_id
dtype: int32
- name: label.segmentation_bitmap
dtype: image
splits:
- name: train
num_bytes: 4375553760.0
num_examples: 88
download_size: 286343850
dataset_size: 4375553760.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
qubitbucket/embeddings-tutorial | ---
license: apache-2.0
---
|
Nxrd/michaoficial | ---
license: openrail
---
|
fairlabs/aihub-nmt-dataset-2022-07 | ---
dataset_info:
features:
- name: en
dtype: string
- name: ko
dtype: string
splits:
- name: train
num_bytes: 447177389
num_examples: 1200144
- name: validation
num_bytes: 44752356.769347675
num_examples: 120018
- name: test
num_bytes: 11186411.230652321
num_examples: 30000
download_size: 326090070
dataset_size: 503116157.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
warzin/covers | ---
license: other
license_name: seila
license_link: LICENSE
---
|
sushvij/generativeaisample | ---
license: openrail
language:
- en
pretty_name: gai
--- |
heliosprime/twitter_dataset_1713203122 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 28747
num_examples: 78
download_size: 23120
dataset_size: 28747
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713203122"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nakkhatra/trial_bn | ---
license: cc0-1.0
---
|
ylacombe/librispeech_asr_tags | ---
dataset_info:
- config_name: clean
features:
- name: file
dtype: string
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
- name: utterance_pitch_mean
dtype: float32
- name: utterance_pitch_std
dtype: float32
- name: snr
dtype: float64
- name: c50
dtype: float64
- name: speaking_rate
dtype: float64
- name: phonemes
dtype: string
- name: gender
dtype: string
splits:
- name: train.100
num_bytes: 17998991
num_examples: 28539
- name: train.360
num_bytes: 65429327
num_examples: 104014
- name: validation
num_bytes: 1238969
num_examples: 2703
- name: test
num_bytes: 1205066
num_examples: 2620
download_size: 40197691
dataset_size: 85872353
- config_name: other
features:
- name: file
dtype: string
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
- name: utterance_pitch_mean
dtype: float32
- name: utterance_pitch_std
dtype: float32
- name: snr
dtype: float64
- name: c50
dtype: float64
- name: speaking_rate
dtype: float64
- name: phonemes
dtype: string
splits:
- name: train.500
num_bytes: 87768115
num_examples: 148688
- name: validation
num_bytes: 1196395
num_examples: 2864
- name: test
num_bytes: 1228421
num_examples: 2939
download_size: 42452591
dataset_size: 90192931
configs:
- config_name: clean
data_files:
- split: train.100
path: clean/train.100-*
- split: train.360
path: clean/train.360-*
- split: validation
path: clean/validation-*
- split: test
path: clean/test-*
- config_name: other
data_files:
- split: train.500
path: other/train.500-*
- split: validation
path: other/validation-*
- split: test
path: other/test-*
---
|
Prince3069/Speedbolt | ---
license: apache-2.0
---
|
alpayariyak/SkunkData-Corpus-Clusters | ---
configs:
- config_name: default
data_files:
- split: config32
path: data/config32-*
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: config32
num_bytes: 35728713
num_examples: 58432
download_size: 14314061
dataset_size: 35728713
---
# Dataset Card for "SkunkData-Corpus-Clusters"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_liminerity__Blur-7b-v1.2 | ---
pretty_name: Evaluation run of liminerity/Blur-7b-v1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [liminerity/Blur-7b-v1.2](https://huggingface.co/liminerity/Blur-7b-v1.2) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__Blur-7b-v1.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-18T13:00:27.961191](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-v1.2/blob/main/results_2024-01-18T13-00-27.961191.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6357129950975389,\n\
\ \"acc_stderr\": 0.03262066192131251,\n \"acc_norm\": 0.6382762311799055,\n\
\ \"acc_norm_stderr\": 0.03328259277014658,\n \"mc1\": 0.43451652386780903,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.6030326315591199,\n\
\ \"mc2_stderr\": 0.015260409379504259\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.01414419347189345,\n\
\ \"acc_norm\": 0.6535836177474402,\n \"acc_norm_stderr\": 0.013905011180063223\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6528579964150567,\n\
\ \"acc_stderr\": 0.00475088440109516,\n \"acc_norm\": 0.8387771360286795,\n\
\ \"acc_norm_stderr\": 0.0036698484004877773\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\"\
: 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8091743119266055,\n \"acc_stderr\": 0.016847676400091095,\n \"\
acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.016847676400091095\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464073,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464073\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4782122905027933,\n\
\ \"acc_stderr\": 0.016706617522176136,\n \"acc_norm\": 0.4782122905027933,\n\
\ \"acc_norm_stderr\": 0.016706617522176136\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195448,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195448\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n\
\ \"acc_stderr\": 0.012719949543032207,\n \"acc_norm\": 0.4556714471968709,\n\
\ \"acc_norm_stderr\": 0.012719949543032207\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6290849673202614,\n \"acc_stderr\": 0.019542101564854128,\n \
\ \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.019542101564854128\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43451652386780903,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.6030326315591199,\n\
\ \"mc2_stderr\": 0.015260409379504259\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8058405682715075,\n \"acc_stderr\": 0.01111698339239267\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5284306292645944,\n \
\ \"acc_stderr\": 0.013750202076584422\n }\n}\n```"
repo_url: https://huggingface.co/liminerity/Blur-7b-v1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|arc:challenge|25_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|gsm8k|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hellaswag|10_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-00-27.961191.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T13-00-27.961191.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- '**/details_harness|winogrande|5_2024-01-18T13-00-27.961191.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-18T13-00-27.961191.parquet'
- config_name: results
data_files:
- split: 2024_01_18T13_00_27.961191
path:
- results_2024-01-18T13-00-27.961191.parquet
- split: latest
path:
- results_2024-01-18T13-00-27.961191.parquet
---
# Dataset Card for Evaluation run of liminerity/Blur-7b-v1.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/Blur-7b-v1.2](https://huggingface.co/liminerity/Blur-7b-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__Blur-7b-v1.2",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-18T13:00:27.961191](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Blur-7b-v1.2/blob/main/results_2024-01-18T13-00-27.961191.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6357129950975389,
"acc_stderr": 0.03262066192131251,
"acc_norm": 0.6382762311799055,
"acc_norm_stderr": 0.03328259277014658,
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.6030326315591199,
"mc2_stderr": 0.015260409379504259
},
"harness|arc:challenge|25": {
"acc": 0.6254266211604096,
"acc_stderr": 0.01414419347189345,
"acc_norm": 0.6535836177474402,
"acc_norm_stderr": 0.013905011180063223
},
"harness|hellaswag|10": {
"acc": 0.6528579964150567,
"acc_stderr": 0.00475088440109516,
"acc_norm": 0.8387771360286795,
"acc_norm_stderr": 0.0036698484004877773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.016847676400091095,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.016847676400091095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808514,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808514
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464073,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464073
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4782122905027933,
"acc_stderr": 0.016706617522176136,
"acc_norm": 0.4782122905027933,
"acc_norm_stderr": 0.016706617522176136
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195448,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195448
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032207,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032207
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.019542101564854128,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.019542101564854128
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43451652386780903,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.6030326315591199,
"mc2_stderr": 0.015260409379504259
},
"harness|winogrande|5": {
"acc": 0.8058405682715075,
"acc_stderr": 0.01111698339239267
},
"harness|gsm8k|5": {
"acc": 0.5284306292645944,
"acc_stderr": 0.013750202076584422
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pphuc25/bailamvan | ---
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 9514569
num_examples: 888
download_size: 4680823
dataset_size: 9514569
---
# Dataset Card for "bailamvan"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hecgo067/adv-ele | ---
dataset_info:
features:
- name: ADV
dtype: string
- name: ELE
dtype: string
splits:
- name: train
num_bytes: 430918.56140350876
num_examples: 1732
- name: test
num_bytes: 107978.43859649122
num_examples: 434
download_size: 296740
dataset_size: 538897.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
pdxmusic/splats | ---
license: apache-2.0
---
|
luizapzbn/goodtriever-data | ---
license: apache-2.0
---
# Goodtriever
This repository contains datasets and model generations from the paper `Goodtriever: Adaptive Toxicity Mitigation with Retrieval-augmented Models`, published as a conference paper at EMNLP 2023.
[[Paper]]()[[Code]]()[[Data]](https://huggingface.co/datasets/luizapzbn/goodtriever-data)
- `data`: prompts and datasets used for datastore creation.
- `continual_mitigation`: clustered WILDS data and prompts
- `datastore_quality`: for the experiments on how automatic labeling impacts mitigation results
- `jigsaw`: main dataset, jigsaw unintended bias
- `outputs`: model generations and results for all experiments from the paper.
- `alpha_temperature`
- `datastore_quality`
- `datastore_size`
- `k_neighbors`
- `model_families` (and main table results)
# Citation
|
roborovski/phi-2-labeled | ---
dataset_info:
features:
- name: hexsha
dtype: string
- name: size
dtype: int64
- name: ext
dtype: string
- name: lang
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_repo_head_hexsha
dtype: string
- name: max_stars_repo_licenses
sequence: string
- name: max_stars_count
dtype: int64
- name: max_stars_repo_stars_event_min_datetime
dtype: string
- name: max_stars_repo_stars_event_max_datetime
dtype: string
- name: max_issues_repo_path
dtype: string
- name: max_issues_repo_name
dtype: string
- name: max_issues_repo_head_hexsha
dtype: string
- name: max_issues_repo_licenses
sequence: string
- name: max_issues_count
dtype: int64
- name: max_issues_repo_issues_event_min_datetime
dtype: string
- name: max_issues_repo_issues_event_max_datetime
dtype: string
- name: max_forks_repo_path
dtype: string
- name: max_forks_repo_name
dtype: string
- name: max_forks_repo_head_hexsha
dtype: string
- name: max_forks_repo_licenses
sequence: string
- name: max_forks_count
dtype: int64
- name: max_forks_repo_forks_event_min_datetime
dtype: string
- name: max_forks_repo_forks_event_max_datetime
dtype: string
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
- name: label
dtype: int64
- name: cost
dtype: float64
splits:
- name: train
num_bytes: 283814677
num_examples: 50000
download_size: 112938830
dataset_size: 283814677
---
# Dataset Card for "phi-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
djfelipe/cava | ---
license: openrail
---
|
Nexdata/4001_People_Single_Object_Multi_view_Tracking_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
4,001 People Single Object Multi-view Tracking Data. The data was collected in indoor and outdoor scenes (such as supermarkets, malls and communities), and each subject appeared in at least 7 cameras. The data diversity covers different ages, time periods, cameras, human body orientations and postures, and collecting scenes. It can be used for computer vision tasks such as object detection and object tracking in multi-view scenes.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1231?source=Huggingface
## Data size
4,001 people, about 385-2,779 images per person
## Race distribution
Asian
## Gender distribution
2,052 males, 1,949 females
## Age distribution
from children to the elderly
## Collecting environment
including indoor and outdoor scenes (such as supermarkets, malls and communities)
## Data diversity
different ages, different time periods, different cameras, different human body orientations and postures, different collecting scenes
## Device
surveillance cameras; the image resolution is not less than 1,920*1,080
## Data format
the image data format is .jpg, the annotation file format is .json
## Annotation content
human body rectangular bounding boxes
## Accuracy
A rectangular human body bounding box is qualified when the deviation is not more than 3 pixels, and the qualified rate of the bounding boxes shall not be lower than 97%.
# Licensing Information
Commercial License
|
Loie/VGGSound | ---
task_categories:
- audio-classification
size_categories:
- 100B<n<1T
---
# VGGSound
VGG-Sound is an audio-visual correspondence dataset consisting of short clips of audio sounds, extracted from videos uploaded to YouTube.
- **Homepage:** https://www.robots.ox.ac.uk/~vgg/data/vggsound/
- **Paper:** https://arxiv.org/abs/2004.14368
- **Github:** https://github.com/hche11/VGGSound
## Analysis
- **310+ classes:** VGG-Sound contains audio spanning a large number of challenging acoustic environments and noise characteristics of real applications.
- **200,000+ videos:** All videos are captured "in the wild" with audio-visual correspondence in the sense that the sound source is visually evident.
- **550+ hours:** VGG-Sound consists of both audio and video. Each segment is 10 seconds long.

## Download
We provide a csv file. For each YouTube video, we provide the YouTube URL, time stamp, audio label and train/test split. Each line in the csv file has the following columns:
```
# YouTube ID, start seconds, label, train/test split.
```
And you can download VGGSound directly from this [repository](https://huggingface.co/datasets/Loie/VGGSound/tree/main).
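Since the file is a plain four-column csv, it can be parsed with the standard library. A minimal sketch (the file name `vggsound.csv` is an assumption; the record layout follows the comment above):

```python
import csv

def load_vggsound(path):
    """Parse the VGGSound csv into a list of clip records.

    Each row is assumed to be: YouTube ID, start seconds, label, split.
    """
    clips = []
    with open(path, newline="") as f:
        for ytid, start, label, split in csv.reader(f):
            clips.append({
                "youtube_id": ytid,
                "start_seconds": float(start),
                "label": label,
                "split": split,  # "train" or "test"
            })
    return clips
```

From the returned list you can filter by split or label before fetching the corresponding 10-second clips.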
## License
The VGG-Sound dataset is available to download for commercial/research purposes under a Creative Commons Attribution 4.0 International License. The copyright remains with the original owners of the video. A complete version of the license can be found [here](https://thor.robots.ox.ac.uk/datasets/vggsound/license_vggsound.txt).
## Citation
Please cite the following if you make use of the dataset.
```
@InProceedings{Chen20,
author = "Honglie Chen and Weidi Xie and Andrea Vedaldi and Andrew Zisserman",
title = "VGGSound: A Large-scale Audio-Visual Dataset",
booktitle = "International Conference on Acoustics, Speech, and Signal Processing (ICASSP)",
year = "2020",
}
``` |
pacovaldez/stackoverflow-questions-2016 | ---
annotations_creators:
- machine-generated
language:
- en
language_creators:
- found
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: stackoverflow_post_questions
size_categories:
- 1M<n<10M
source_datasets:
- original
tags:
- stackoverflow
- technical questions
task_categories:
- text-classification
task_ids:
- multi-class-classification
---
# Dataset Card for [Stackoverflow Post Questions]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Source Data](#source-data)
- [Contributions](#contributions)
## Dataset Description
Companies that sell open-source software tools usually hire an army of customer representatives to try to answer every question asked about their tool. The first step in this process
is the prioritization of the question. The classification scale usually consists of 4 values (P0, P1, P2, and P3), whose meanings differ across the industry. On
the other hand, every software developer in the world has dealt with Stack Overflow (SO); the amount of shared knowledge there is incomparable to any other website. Questions on SO are
usually annotated and curated by thousands of people, providing metadata about the quality of the question. This dataset aims to provide an accurate prioritization for programming
questions.
### Dataset Summary
The dataset contains the title and body of Stack Overflow questions and a label value (0, 1, 2, 3) that was calculated using thresholds defined by SO badges.
### Languages
English
## Dataset Structure
- `title`: string
- `body`: string
- `label`: int
### Data Splits
The split is 40/40/20, where classes have been balanced to be around the same size.
## Dataset Creation
The data set was extracted and labeled with the following query in BigQuery:
```
SELECT
title,
body,
CASE
WHEN score >= 100 OR favorite_count >= 100 OR view_count >= 10000 THEN 0
WHEN score >= 25 OR favorite_count >= 25 OR view_count >= 2500 THEN 1
WHEN score >= 10 OR favorite_count >= 10 OR view_count >= 1000 THEN 2
ELSE 3
END AS label
FROM `bigquery-public-data`.stackoverflow.posts_questions
```
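The CASE expression above maps directly onto a plain Python function, which can be handy for re-labeling records outside BigQuery (a sketch; it assumes missing counts have been filled with 0):

```python
def label_question(score, favorite_count, view_count):
    """Priority label matching the BigQuery CASE expression above."""
    if score >= 100 or favorite_count >= 100 or view_count >= 10000:
        return 0
    if score >= 25 or favorite_count >= 25 or view_count >= 2500:
        return 1
    if score >= 10 or favorite_count >= 10 or view_count >= 1000:
        return 2
    return 3
```

Note that, like the SQL, the thresholds are checked from most to least popular, so a question gets the highest priority bucket it qualifies for.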
### Source Data
The data was extracted from the Big Query public dataset: `bigquery-public-data.stackoverflow.posts_questions`
#### Initial Data Collection and Normalization
The original dataset contained high class imbalance:
| label | count |
|------:|---------:|
| 0 | 977424 |
| 1 | 2401534 |
| 2 | 3418179 |
| 3 | 16222990 |
| Grand Total | 23020127 |
The data was sampled from each class to have around the same amount of records on every class.
### Contributions
Thanks to [@pacofvf](https://github.com/pacofvf) for adding this dataset.
|
youssef101/artelingo | ---
license: other
task_categories:
- text-generation
- text-classification
- image-classification
- image-to-text
- text-to-image
language:
- en
- ar
- zh
tags:
- art
- Affective Captioning
- Emotions
- Emotion Prediction
- Image Captioning
- Multilingual
- Cultural
- Diversity
pretty_name: ArtELingo
size_categories:
- 10K<n<100K
- 100K<n<1M
- 1M<n<10M
multilinguality:
- multilingual
source_datasets:
- original
---
# Dataset Card for "ArtELingo"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Dataset Configurations](#dataset-configurations)
- [Data Fields](#data-fields)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [artelingo.org/](https://www.artelingo.org/)
- **Repository:** [More Information Needed](https://github.com/Vision-CAIR/artelingo)
- **Paper:** [More Information Needed](https://arxiv.org/abs/2211.10780)
- **Point of Contact:** [More Information Needed](artelingo.dataset@gmail.com)
### Dataset Summary
ArtELingo is a benchmark and dataset introduced in a research paper aimed at promoting work on diversity across languages and cultures.
It is an extension of ArtEmis, which is a collection of 80,000 artworks from WikiArt with 450,000 emotion labels and English-only captions.
ArtELingo expands this dataset by adding 790,000 annotations in Arabic and Chinese.
The purpose of these additional annotations is to evaluate the performance of "cultural-transfer" in AI systems.
The goal of ArtELingo is to encourage research on multilinguality and culturally-aware AI.
By including annotations in multiple languages and considering cultural differences,
the dataset aims to build more human-compatible AI that is sensitive to emotional nuances
across various cultural contexts. The researchers believe that studying emotions in this
way is crucial to understanding a significant aspect of human intelligence.
### Supported Tasks and Leaderboards
We have two tasks:
- [Emotion Label Prediction](https://eval.ai/web/challenges/challenge-page/2106/overview)
- [Affective Image Captioning](https://eval.ai/web/challenges/challenge-page/2104/overview)
Both challenges have a leaderboard on Eval.ai. Submission deadlines can be viewed from the above links.
In addition, we are hosting the challenge at the ICCV23 workshop [WECIA](https://iccv23-wecia.github.io/). We have cash prizes for winners.
### Languages
We have 3 languages: English, Arabic, and Chinese. For each image, we have at least 5 captions in each language.
In total we have 80,000 images which are downloaded automatically with the dataset.
## Dataset Structure
We show detailed information for all the configurations of the dataset.
### Dataset Configurations
We have 4 Configurations:
#### artelingo
- **Size of downloaded dataset files:** 23 GB
- **Splits:** \['train', 'test', 'val'\]
- **Number of Samples per splits:** \[920K, 94.1K, 46.9K\]
- **Loading Script**:
```python
from datasets import load_dataset
dataset = load_dataset(path="youssef101/artelingo", name='artelingo')
```
you can also provide a `splits:LIST(str)` parameter to avoid downloading the huge files for all the splits (especially the train set :)).
```python
from datasets import load_dataset
dataset = load_dataset(path="youssef101/artelingo", name='artelingo', splits=['val'])
```
Note that this makes the `dev` configuration below redundant.
#### dev
- **Size of downloaded dataset files:** 3 GB
- **Splits:** \['test', 'val'\]
- **Number of Samples per splits:** \[94.1K, 46.9K\]
- **Loading Script**:
```python
from datasets import load_dataset
dataset = load_dataset(path="youssef101/artelingo", name='dev')
```
#### wecia-emo
Intended for the [WECIA](https://iccv23-wecia.github.io/) emotion prediction challenge. Instances do not have the emotion or language attributes.
- **Size of downloaded dataset files:** 1.2 GB
- **Splits:** \['dev'\]
- **Number of Samples per splits:** \[27.9K\]
- **Loading Script**:
```python
from datasets import load_dataset
dataset = load_dataset(path="youssef101/artelingo", name='wecia-emo')
```
#### wecia-cap
Intended for the [WECIA](https://iccv23-wecia.github.io/) affective caption generation challenge. Instances do not have the text field.
- **Size of downloaded dataset files:** 1.2 GB
- **Splits:** \['dev'\]
- **Number of Samples per splits:** \[16.3K\]
- **Loading Script**:
```python
from datasets import load_dataset
dataset = load_dataset(path="youssef101/artelingo", name='wecia-cap')
```
### Data Fields
The data fields are the same among all configs.
- `uid`: a `int32` feature. A unique identifier for each instance.
- `image`: a `PIL.Image` feature. The image of the artwork from the wikiart dataset.
- `art_style`: a `string` feature. The art style of the artwork. Styles are a subset from the [wikiart styles](https://www.wikiart.org/en/paintings-by-style).
- `painting`: a `string` feature. The name of the painting according to the wikiart dataset.
- `emotion`: a `string` feature. The emotion associated with the image caption pair.
- `language`: a `string` feature. The language used to write the caption.
- `text`: a `string` feature. The affective caption that describes the painting under the context of the selected emotion.
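As a sketch of how these fields fit together, the captions for one painting can be grouped by language. The records below are illustrative stand-ins, not real annotations:

```python
from collections import defaultdict

def captions_by_language(records):
    """Group affective captions of each (art_style, painting) pair by language."""
    grouped = defaultdict(lambda: defaultdict(list))
    for r in records:
        key = (r["art_style"], r["painting"])
        grouped[key][r["language"]].append((r["emotion"], r["text"]))
    return grouped

# Illustrative records following the field list above (not real annotations).
records = [
    {"art_style": "impressionism", "painting": "example-painting",
     "language": "english", "emotion": "awe", "text": "an example caption"},
    {"art_style": "impressionism", "painting": "example-painting",
     "language": "arabic", "emotion": "awe", "text": "an example Arabic caption"},
]
grouped = captions_by_language(records)
```

With the real dataset, any split (e.g. `dataset["val"]`) behaves as an iterable of such records.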
## Dataset Creation
### Curation Rationale
ArtELingo is a benchmark and dataset designed to promote research on diversity
across languages and cultures. It builds upon ArtEmis, a collection of 80,000
artworks from WikiArt with 450,000 emotion labels and English-only captions.
ArtELingo extends this dataset by adding 790,000 annotations in Arabic and
Chinese, as well as 4,800 annotations in Spanish, allowing for the evaluation
of "cultural-transfer" performance in AI systems. With many artworks having
multiple annotations in three languages, the dataset enables the investigation
of similarities and differences across linguistic and cultural contexts.
Additionally, ArtELingo explores captioning tasks, demonstrating how diversity
in annotations can improve the performance of baseline AI models. The hope is
that ArtELingo will facilitate future research on multilinguality and
culturally-aware AI. The dataset is publicly available, including standard
splits and baseline models, to support and ease further research in this area.
### Source Data
#### Initial Data Collection and Normalization
ArtELingo uses images from the [wikiart dataset](https://www.wikiart.org/).
The images are mainly artworks since they are created with the intention to
have an emotional impact on the viewer. ArtELingo assumes that WikiArt
is a representative sample of the cultures of interest. While WikiArt
is remarkably comprehensive, it has better coverage of the West than other
regions of the world based on WikiArt’s assignment of artworks to nationalities.
The data was collected via Amazon Mechanical Turk, where only native speakers
were allowed to annotate the images. The English, Arabic, and Chinese subsets were
collected by 6377, 656, and 745 workers respectively. All workers were compensated
with above minimal wage in each respective country.
#### Who are the source language producers?
The data comes from Human annotators who natively speak each respective language.
## Considerations for Using the Data
### Social Impact of Dataset
When using the ArtELingo dataset, researchers and developers must be mindful of
the potential social impact of the data. Emotions, cultural expressions, and
artistic representations can be sensitive topics, and AI systems trained on such
data may have implications on how they perceive and respond to users. It is
crucial to ensure that the dataset's usage does not perpetuate stereotypes or
biases related to specific cultures or languages. Ethical considerations should
be taken into account during the development and deployment of AI models trained
on ArtELingo to avoid any harmful consequences on individuals or communities.
### Discussion of Biases
ArtELingo was filtered against hate speech, racism, and obvious stereotypes.
However, Like any dataset, ArtELingo may contain inherent biases that could
influence the performance and behavior of AI systems. These biases could
arise from various sources, such as cultural differences in emotional
interpretations, variations in annotator perspectives, or imbalances in
the distribution of annotations across languages and cultures. Researchers
should be cautious about potential biases that might impact the dataset's
outcomes and address them appropriately. Transparently discussing and
documenting these biases is essential to facilitate a fair understanding of the
dataset's limitations and potential areas of improvement.
## Additional Information
### Dataset Curators
The corpus was put together by [Youssef Mohamed](https://cemse.kaust.edu.sa/people/person/youssef-s-mohamed),
[Mohamed Abdelfattah](https://people.epfl.ch/mohamed.abdelfattah/?lang=en),
[Shyma Alhuwaider](https://cemse.kaust.edu.sa/aanslab/people/person/shyma-y-alhuwaider),
[Feifan Li](https://www.linkedin.com/in/feifan-li-3280a6249/),
[Xiangliang Zhang](https://engineering.nd.edu/faculty/xiangliang-zhang/),
[Kenneth Ward Church](https://www.khoury.northeastern.edu/people/kenneth-church/)
and [Mohamed Elhoseiny](https://cemse.kaust.edu.sa/people/person/mohamed-elhoseiny).
### Licensing Information
Terms of Use: Before we are able to offer you access to the database,
please agree to the following terms of use. After approval, you (the 'Researcher')
receive permission to use the ArtELingo database (the 'Database') at King Abdullah
University of Science and Technology (KAUST). In exchange for being able to join the
ArtELingo community and receive such permission, Researcher hereby agrees to the
following terms and conditions: [1.] The Researcher shall use the Database only for
non-commercial research and educational purposes. [2.] The Universities make no
representations or warranties regarding the Database, including but not limited to
warranties of non-infringement or fitness for a particular purpose. [3.] Researcher
accepts full responsibility for his or her use of the Database and shall defend and
indemnify the Universities, including their employees, Trustees, officers and agents,
against any and all claims arising from Researcher's use of the Database, and
Researcher's use of any copies of copyrighted 2D artworks originally uploaded to
http://www.wikiart.org that the Researcher may use in connection with the Database.
[4.] Researcher may provide research associates and colleagues with access to the
Database provided that they first agree to be bound by these terms and conditions.
[5.] The Universities reserve the right to terminate Researcher's access to the Database
at any time. [6.] If Researcher is employed by a for-profit, commercial entity,
Researcher's employer shall also be bound by these terms and conditions, and Researcher
hereby represents that he or she is fully authorized to enter into this agreement on
behalf of such employer. [7.] The international copyright laws shall apply to all
disputes under this agreement.
### Citation Information
```
@inproceedings{mohamed2022artelingo,
title={ArtELingo: A Million Emotion Annotations of WikiArt with Emphasis on Diversity over Language and Culture},
author={Mohamed, Youssef and Abdelfattah, Mohamed and Alhuwaider, Shyma and Li, Feifan and Zhang, Xiangliang and Church, Kenneth and Elhoseiny, Mohamed},
booktitle={Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing},
pages={8770--8785},
year={2022}
}
```
### Contributions
Thanks to [@youssef101](https://github.com/Mo-youssef) for adding this dataset. [@Faizan](https://faixan-khan.github.io/) for testing. |
Minata/70000_method2test_tokonized_ForCausalLM | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: test
num_bytes: 598079592
num_examples: 89694
download_size: 109394438
dataset_size: 598079592
---
# Dataset Card for "70000_method2test_tokonized_ForCausalLM"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lshowway/wikipedia.reorder.osv.fr | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 886603410
num_examples: 490371
download_size: 404858868
dataset_size: 886603410
---
# Dataset Card for "wikipedia.reorder.osv.fr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-professional_psychology | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 5774
num_examples: 5
- name: test
num_bytes: 3081725
num_examples: 612
download_size: 304580
dataset_size: 3087499
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-professional_psychology"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hanabatake_yoshiko_ahogirl | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Hanabatake Yoshiko
This is the dataset of Hanabatake Yoshiko, containing 200 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 463 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 463 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 463 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 463 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
MadNLP769/isacarsm | ---
license: mit
---
|
Jzuluaga/uwb_atcc | ---
dataset_info:
features:
- name: id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: segment_start_time
dtype: float32
- name: segment_end_time
dtype: float32
- name: duration
dtype: float32
splits:
- name: test
num_bytes: 140620332.25
num_examples: 2822
- name: train
num_bytes: 608597323.625
num_examples: 11291
download_size: 711464914
dataset_size: 749217655.875
tags:
- audio
- automatic-speech-recognition
- en-atc
- en
- noisy-speech-recognition
- speech-recognition
task_categories:
- automatic-speech-recognition
language:
- en
multilinguality:
- monolingual
license:
- cc-by-nc-sa-4.0
---
# Dataset Card for UWB-ATCC corpus
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages and Other Details](#languages-and-other-details)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [UWB-ATCC corpus homepage](https://lindat.mff.cuni.cz/repository/xmlui/handle/11858/00-097C-0000-0001-CCA1-0)
- **Repository:** [GitHub repository (used in research)](https://github.com/idiap/w2v2-air-traffic)
- **Paper:** [Air traffic control communication (ATCC) speech corpora and their use for ASR and TTS development](https://link.springer.com/article/10.1007/s10579-019-09449-5)
- **Paper of this research:** [How Does Pre-trained Wav2Vec 2.0 Perform on Domain Shifted ASR? An Extensive Benchmark on Air Traffic Control Communications](https://arxiv.org/abs/2203.16822)
### Dataset Summary
The UWB-ATCC Corpus is provided by the University of West Bohemia, Department of Cybernetics. The corpus contains recordings of communication between air traffic controllers and pilots. The speech is manually transcribed and labeled with information about the speaker (pilot/controller, not the full identity of the person). The corpus is currently small (20 hours), but the authors plan to search for additional data next year. The audio data format is: 8 kHz, 16-bit PCM, mono.
Importantly, the speaker roles can be obtained from the `id (string)` field. For instance:
- `_PI`: segment with only pilot speech
- `_AT`: segment with only ATCO speech
- `PIAT`: segment with both, ATCO and pilot speech
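Based on the suffixes above, the speaker role of a segment can be recovered with a small helper. This is a sketch: the suffix convention comes from the description above, while the example ids and the exact id format beyond the suffix are assumptions.

```python
def speaker_role(segment_id: str) -> str:
    """Infer the speaker role from the suffix of the `id` field.

    Check the longer `PIAT` suffix first, since it also ends in `AT`.
    """
    if segment_id.endswith('PIAT'):
        return 'pilot+atco'   # both speakers in the segment
    if segment_id.endswith('_PI'):
        return 'pilot'
    if segment_id.endswith('_AT'):
        return 'atco'
    return 'unknown'
```

This can be used, for example, to split the corpus into pilot-only and ATCO-only subsets before training role-specific models.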
### Supported Tasks and Leaderboards
- `automatic-speech-recognition`: already adapted/fine-tuned models are available, e.g. [XLS-R-300m](https://huggingface.co/Jzuluaga/wav2vec2-large-960h-lv60-self-en-atc-atcosim).
### Languages and Other Details
The text and the recordings are in English. The authors took advantage of the fact that one of their industrial partners develops complex IT solutions for several ATC authorities and airports and, as such, has access to the ATC communication recordings collected in the Czech airspace. This partner was able to secure the following data:
- Ground control—communication before takeoff and after landing—19.2 h of data.
- Tower control—communication during takeoff, landing and landing standby—22.5 h.
- Approach control—communication during landing approach—25.5 h.
- Area control—communication during overflights and cruises—71.3 h.
(Not all data is released. Check their website [here](https://lindat.mff.cuni.cz/repository/xmlui/handle/11858/00-097C-0000-0001-CCA1-0))
## Dataset Structure
### Data Fields
- `id (string)`: a string identifier for each example (recording segment).
- `audio (audio)`: audio data for the given ID
- `text (string)`: already-normalized transcript of the segment. See these repositories for more details: [w2v2-air-traffic](https://github.com/idiap/w2v2-air-traffic) and [bert-text-diarization-atc](https://github.com/idiap/bert-text-diarization-atc)
- `segment_start_time (float32)`: segment start time (normally 0)
- `segment_end_time (float32)`: segment end time
- `duration (float32)`: duration of the recording, computed as `segment_end_time - segment_start_time`
## Additional Information
### Licensing Information
The licensing status of this dataset follows that of the original [UWB-ATCC corpus](https://lindat.mff.cuni.cz/repository/xmlui/handle/11858/00-097C-0000-0001-CCA1-0), whose creators released it under the [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)](https://creativecommons.org/licenses/by-nc-sa/4.0/) license.
### Citation Information
Contributors who prepared, processed, normalized and uploaded the dataset in HuggingFace:
```
@article{zuluaga2022how,
title={How Does Pre-trained Wav2Vec 2.0 Perform on Domain Shifted ASR? An Extensive Benchmark on Air Traffic Control Communications},
author={Zuluaga-Gomez, Juan and Prasad, Amrutha and Nigmatulina, Iuliia and Sarfjoo, Saeed and others},
journal={IEEE Spoken Language Technology Workshop (SLT), Doha, Qatar},
year={2022}
}
@article{zuluaga2022bertraffic,
title={BERTraffic: BERT-based Joint Speaker Role and Speaker Change Detection for Air Traffic Control Communications},
author={Zuluaga-Gomez, Juan and Sarfjoo, Seyyed Saeed and Prasad, Amrutha and others},
journal={IEEE Spoken Language Technology Workshop (SLT), Doha, Qatar},
year={2022}
}
@article{zuluaga2022atco2,
title={ATCO2 corpus: A Large-Scale Dataset for Research on Automatic Speech Recognition and Natural Language Understanding of Air Traffic Control Communications},
author={Zuluaga-Gomez, Juan and Vesel{\`y}, Karel and Sz{\"o}ke, Igor and Motlicek, Petr and others},
journal={arXiv preprint arXiv:2211.04054},
year={2022}
}
```
Authors of the dataset:
```
@article{vsmidl2019air,
title={Air traffic control communication (ATCC) speech corpora and their use for ASR and TTS development},
author={{\v{S}}m{\'\i}dl, Lubo{\v{s}} and {\v{S}}vec, Jan and Tihelka, Daniel and Matou{\v{s}}ek, Jind{\v{r}}ich and Romportl, Jan and Ircing, Pavel},
journal={Language Resources and Evaluation},
volume={53},
number={3},
pages={449--464},
year={2019},
publisher={Springer}
}
```
|
McSpicyWithMilo/instruction-types-0.3split | ---
dataset_info:
features:
- name: instruction_type
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 24468
num_examples: 280
- name: test
num_bytes: 10561
num_examples: 120
download_size: 18875
dataset_size: 35029
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "instruction-types-0.3split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TanveerAman/AMI-Corpus-Text-Summarization | ---
task_categories:
- summarization
language:
- en
--- |
CognitiveScience/data2 | ---
license: mit
from: https://huggingface.co/datasets/saranya132/dialog_uid_gpt2
---
|
amitness/logits-mt-ar-512 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- name: teacher_logits
sequence:
sequence: float64
- name: teacher_indices
sequence:
sequence: int64
- name: teacher_mask_indices
sequence: int64
splits:
- name: train
num_bytes: 17102298098.701529
num_examples: 940987
- name: test
num_bytes: 3018061158.5240602
num_examples: 166057
download_size: 7348415360
dataset_size: 20120359257.22559
---
# Dataset Card for "logits-mt-ar-512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CATIE-AQ/universal_dependencies_fr_gsd_fr_prompt_pos | ---
language:
- fr
license: cc-by-sa-4.0
size_categories:
- 100K<n<1M
task_categories:
- token-classification
tags:
- pos
- DFP
- french prompts
annotations_creators:
- found
language_creators:
- found
multilinguality:
- monolingual
source_datasets:
- universal_dependencies_fr_gsd
---
# universal_dependencies_fr_gsd_fr_prompt_pos
## Summary
**universal_dependencies_fr_gsd_fr_prompt_pos** is a subset of the [**Dataset of French Prompts (DFP)**](https://huggingface.co/datasets/CATIE-AQ/DFP).
It contains **343,161** rows that can be used for a part-of-speech tagging task.
The original data (without prompts) comes from the dataset [universal_dependencies](https://huggingface.co/datasets/universal_dependencies) where only the French gsd split has been kept.
A list of prompts (see below) was then applied in order to build the input and target columns and thus obtain the same format as the [xP3](https://huggingface.co/datasets/bigscience/xP3) dataset by Muennighoff et al.
## Prompts used
### List
21 prompts were created for this dataset. The logic applied consists in proposing each prompt in three forms: the infinitive, the informal imperative (tutoiement), and the formal imperative (vouvoiement).
```
'Extraire les classes des mots du texte suivant : '+text,
'Extrais les classes des mots du texte suivant : '+text,
'Extrayez les classes des mots du texte suivant : '+text,
'Isoler les classes des mots du texte suivant : '+text,
'Isole les classes des mots du texte suivant : '+text,
'Isolez les classes des mots du texte suivant : '+text,
'Dégager les classes des mots dans le texte : '+text,
'Dégage les classes des mots dans le texte : '+text,
'Dégagez les classes des mots dans le texte : '+text,
'Générer les classes des mots issues du texte suivant : '+text,
'Génère les classes des mots issues du texte suivant : '+text,
'Générez les classes des mots issues du texte suivant : '+text,
'Trouver les classes des mots du texte : '+text,
'Trouve les classes des mots du texte : '+text,
'Trouvez les classes des mots du texte : '+text,
'Repérer les classes des mots présentes dans le texte suivant : '+text,
'Repère les classes des mots présentes dans le texte suivant : '+text,
'Repérez les classes des mots présentes dans le texte suivant : '+text,
'Indiquer les classes des mots du texte : '+text,
'Indique les classes des mots du texte : '+text,
'Indiquez les classes des mots du texte : '+text
```
### Features used in the prompts
In the prompt list above, `text` and `targets` have been constructed from:
```
fr_gsd = load_dataset('universal_dependencies', 'fr_gsd')
# text: join each example's token list into a single string
fr_gsd['train']['tokens'] = list(map(
    lambda i: ' '.join(fr_gsd['train']['tokens'][i]),
    range(len(fr_gsd['train']['tokens']))))
# targets: map the integer upos ids to their tag names
fr_gsd['train']['upos'] = list(map(
    lambda x: x.replace("[", "").replace("]", "")
               .replace('17', 'AUX').replace('16', 'VERB').replace('15', 'INTJ')
               .replace('14', 'ADV').replace('13', '_').replace('12', 'X')
               .replace('11', 'PRON').replace('10', 'PROPN').replace('9', 'CCONJ')
               .replace('8', 'DET').replace('7', 'PART').replace('6', 'ADJ')
               .replace('5', 'SCONJ').replace('4', 'SYM').replace('3', 'NUM')
               .replace('2', 'ADP').replace('1', 'PUNCT').replace('0', 'NOUN'),
    map(str, fr_gsd['train']['upos'])))
```
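The digit-substitution above depends on the order of the `replace` calls (multi-digit ids must be handled first). An equivalent, order-independent sketch maps the integer `upos` ids through an explicit tag list. The list below mirrors the id-to-tag mapping used in the substitution chain; treat its exact order as an assumption about the `universal_dependencies` label inventory.

```python
# UPOS tag names in the integer-id order implied by the replace() chain above.
UPOS_NAMES = ['NOUN', 'PUNCT', 'ADP', 'NUM', 'SYM', 'SCONJ', 'ADJ', 'PART',
              'DET', 'CCONJ', 'PROPN', 'PRON', 'X', '_', 'ADV', 'INTJ',
              'VERB', 'AUX']

def upos_to_string(upos_ids):
    """Turn a list of integer upos ids into a comma-separated tag string,
    matching the output format of the string-substitution approach."""
    return ', '.join(UPOS_NAMES[i] for i in upos_ids)
```

A direct index lookup like this fails loudly on an unexpected id instead of silently leaving digits in the target string.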
# Splits
- `train` with 303,429 samples
- `valid` with 30,996 samples
- `test` with 8,736 samples
# How to use?
```
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/universal_dependencies_fr_gsd_fr_prompt_pos")
```
# Citation
## Original data
> Contributors: de Marneffe, Marie-Catherine; Guillaume, Bruno; McDonald, Ryan; Suhr, Alane; Nivre, Joakim; Grioni, Matias; Dickerson, Carly; Perrier, Guy
## This Dataset
```
@misc{centre_aquitain_des_technologies_de_l'information_et_electroniques_2023,
  author = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
  title = { DFP (Revision 1d24c09) },
  year = 2023,
  url = { https://huggingface.co/datasets/CATIE-AQ/DFP },
  doi = { 10.57967/hf/1200 },
  publisher = { Hugging Face }
}
```
## License
CC BY-SA 4.0 |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/52026443 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1342
dataset_size: 188
---
# Dataset Card for "52026443"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mamakhan/Tools | ---
license: openrail
---
|
irds/antique_train_split200-valid | ---
pretty_name: '`antique/train/split200-valid`'
viewer: false
source_datasets: ['irds/antique']
task_categories:
- text-retrieval
---
# Dataset Card for `antique/train/split200-valid`
The `antique/train/split200-valid` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/antique#antique/train/split200-valid).
# Data
This dataset provides:
- `queries` (i.e., topics); count=200
- `qrels` (i.e., relevance assessments); count=2,193
- For `docs`, use [`irds/antique`](https://huggingface.co/datasets/irds/antique)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/antique_train_split200-valid', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/antique_train_split200-valid', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Hashemi2020Antique,
title={ANTIQUE: A Non-Factoid Question Answering Benchmark},
author={Helia Hashemi and Mohammad Aliannejadi and Hamed Zamani and Bruce Croft},
booktitle={ECIR},
year={2020}
}
```
|
heliosprime/twitter_dataset_1713090697 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 6144
num_examples: 15
download_size: 10882
dataset_size: 6144
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713090697"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
seonglae/resrer-nq | ---
dataset_info:
features:
- name: document_text
dtype: string
- name: long_answer_candidates
list:
- name: end_token
dtype: int64
- name: start_token
dtype: int64
- name: top_level
dtype: bool
- name: question_text
dtype: string
- name: annotations
list:
- name: annotation_id
dtype: float64
- name: long_answer
struct:
- name: candidate_index
dtype: int64
- name: end_token
dtype: int64
- name: start_token
dtype: int64
- name: short_answers
list:
- name: end_token
dtype: int64
- name: start_token
dtype: int64
- name: yes_no_answer
dtype: string
- name: document_url
dtype: string
- name: example_id
dtype: int64
- name: long_answer_text
dtype: string
- name: short_answer_text
dtype: string
- name: split_id
dtype: string
- name: answer_exist_chunk
dtype: bool
- name: summarization_text
dtype: string
splits:
- name: train
num_bytes: 497862929.4517183
num_examples: 55113
download_size: 121306017
dataset_size: 497862929.4517183
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cstech/ssd-warp-2024 | ---
license: deepfloyd-if-license
---
|
CyberHarem/kurosaki_honoka_encouragementofclimb | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kurosaki Honoka
This is the dataset of Kurosaki Honoka, containing 82 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 82 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 196 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 228 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 82 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 82 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 82 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 196 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 196 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 172 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 228 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 228 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
joey234/mmlu-professional_medicine-rule-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 440522
num_examples: 272
download_size: 250093
dataset_size: 440522
---
# Dataset Card for "mmlu-professional_medicine-rule-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joheras/adv-ele | ---
dataset_info:
features:
- name: ADV
dtype: string
- name: ELE
dtype: string
splits:
- name: train
num_bytes: 430918.56140350876
num_examples: 1732
- name: test
num_bytes: 107978.43859649122
num_examples: 434
download_size: 299002
dataset_size: 538897.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "adv-ele"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Isotonic__Mixnueza-Chat-6x32M-MoE | ---
pretty_name: Evaluation run of Isotonic/Mixnueza-Chat-6x32M-MoE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Isotonic/Mixnueza-Chat-6x32M-MoE](https://huggingface.co/Isotonic/Mixnueza-Chat-6x32M-MoE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Isotonic__Mixnueza-Chat-6x32M-MoE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T02:09:55.470077](https://huggingface.co/datasets/open-llm-leaderboard/details_Isotonic__Mixnueza-Chat-6x32M-MoE/blob/main/results_2024-04-07T02-09-55.470077.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25475467189344064,\n\
\ \"acc_stderr\": 0.030639762090785793,\n \"acc_norm\": 0.2552581810114775,\n\
\ \"acc_norm_stderr\": 0.03144935013039305,\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.015298077509485083,\n \"mc2\": 0.4727026528122458,\n\
\ \"mc2_stderr\": 0.015699277111857743\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.18088737201365188,\n \"acc_stderr\": 0.011248574467407034,\n\
\ \"acc_norm\": 0.20392491467576793,\n \"acc_norm_stderr\": 0.011774262478702256\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2629954192391954,\n\
\ \"acc_stderr\": 0.004393601887506585,\n \"acc_norm\": 0.26528579964150567,\n\
\ \"acc_norm_stderr\": 0.004405829993258718\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891356,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891356\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.030631145539198816,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.030631145539198816\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.028346963777162452,\n\
\ \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.028346963777162452\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.19298245614035087,\n\
\ \"acc_stderr\": 0.037124548537213684,\n \"acc_norm\": 0.19298245614035087,\n\
\ \"acc_norm_stderr\": 0.037124548537213684\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438014,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438014\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n\
\ \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20202020202020202,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.03221024508041154,\n\
\ \"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.03221024508041154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3564102564102564,\n \"acc_stderr\": 0.024283140529467295,\n\
\ \"acc_norm\": 0.3564102564102564,\n \"acc_norm_stderr\": 0.024283140529467295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3025210084033613,\n \"acc_stderr\": 0.02983796238829193,\n \
\ \"acc_norm\": 0.3025210084033613,\n \"acc_norm_stderr\": 0.02983796238829193\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22752293577981653,\n \"acc_stderr\": 0.017974463578776502,\n \"\
acc_norm\": 0.22752293577981653,\n \"acc_norm_stderr\": 0.017974463578776502\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.22058823529411764,\n\
\ \"acc_stderr\": 0.02910225438967409,\n \"acc_norm\": 0.22058823529411764,\n\
\ \"acc_norm_stderr\": 0.02910225438967409\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n\
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.27802690582959644,\n\
\ \"acc_stderr\": 0.030069584874494047,\n \"acc_norm\": 0.27802690582959644,\n\
\ \"acc_norm_stderr\": 0.030069584874494047\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.19642857142857142,\n\
\ \"acc_stderr\": 0.03770970049347018,\n \"acc_norm\": 0.19642857142857142,\n\
\ \"acc_norm_stderr\": 0.03770970049347018\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2264957264957265,\n\
\ \"acc_stderr\": 0.027421007295392926,\n \"acc_norm\": 0.2264957264957265,\n\
\ \"acc_norm_stderr\": 0.027421007295392926\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2848020434227331,\n\
\ \"acc_stderr\": 0.01613917409652258,\n \"acc_norm\": 0.2848020434227331,\n\
\ \"acc_norm_stderr\": 0.01613917409652258\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22832369942196531,\n \"acc_stderr\": 0.022598703804321635,\n\
\ \"acc_norm\": 0.22832369942196531,\n \"acc_norm_stderr\": 0.022598703804321635\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n\
\ \"acc_stderr\": 0.025755865922632924,\n \"acc_norm\": 0.28938906752411575,\n\
\ \"acc_norm_stderr\": 0.025755865922632924\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.023576881744005723,\n\
\ \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.023576881744005723\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843007,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843007\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23728813559322035,\n\
\ \"acc_stderr\": 0.010865436690780264,\n \"acc_norm\": 0.23728813559322035,\n\
\ \"acc_norm_stderr\": 0.010865436690780264\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2173202614379085,\n \"acc_stderr\": 0.016684820929148587,\n \
\ \"acc_norm\": 0.2173202614379085,\n \"acc_norm_stderr\": 0.016684820929148587\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.23265306122448978,\n \"acc_stderr\": 0.02704925791589618,\n\
\ \"acc_norm\": 0.23265306122448978,\n \"acc_norm_stderr\": 0.02704925791589618\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.036643147772880864,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.036643147772880864\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.015298077509485083,\n \"mc2\": 0.4727026528122458,\n\
\ \"mc2_stderr\": 0.015699277111857743\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.505130228887135,\n \"acc_stderr\": 0.01405174596179052\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Isotonic/Mixnueza-Chat-6x32M-MoE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|arc:challenge|25_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|gsm8k|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hellaswag|10_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|winogrande|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T02-09-55.470077.parquet'
- config_name: results
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- results_2024-04-07T02-09-55.470077.parquet
- split: latest
path:
- results_2024-04-07T02-09-55.470077.parquet
---
# Dataset Card for Evaluation run of Isotonic/Mixnueza-Chat-6x32M-MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Isotonic/Mixnueza-Chat-6x32M-MoE](https://huggingface.co/Isotonic/Mixnueza-Chat-6x32M-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Isotonic__Mixnueza-Chat-6x32M-MoE",
"harness_winogrande_5",
split="train")
```
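Because timestamped split names use zero-padded fields (e.g. `2024_04_07T02_09_55.470077`), they sort chronologically as plain strings, so the newest run can also be resolved programmatically. A minimal sketch (no network access; the split names are illustrative):

```python
# Resolve the most recent run from timestamped split names; the "latest"
# split in each configuration is expected to point at the same run.
def latest_split(split_names):
    # Timestamped splits look like "2024_04_07T02_09_55.470077"; the fields
    # are zero-padded, so lexicographic order matches chronological order.
    stamped = [name for name in split_names if name != "latest"]
    return max(stamped)

print(latest_split([
    "latest",
    "2024_03_02T20_37_13.780824",
    "2024_04_07T02_09_55.470077",
]))
# → 2024_04_07T02_09_55.470077
```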
## Latest results
These are the [latest results from run 2024-04-07T02:09:55.470077](https://huggingface.co/datasets/open-llm-leaderboard/details_Isotonic__Mixnueza-Chat-6x32M-MoE/blob/main/results_2024-04-07T02-09-55.470077.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25475467189344064,
"acc_stderr": 0.030639762090785793,
"acc_norm": 0.2552581810114775,
"acc_norm_stderr": 0.03144935013039305,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.015298077509485083,
"mc2": 0.4727026528122458,
"mc2_stderr": 0.015699277111857743
},
"harness|arc:challenge|25": {
"acc": 0.18088737201365188,
"acc_stderr": 0.011248574467407034,
"acc_norm": 0.20392491467576793,
"acc_norm_stderr": 0.011774262478702256
},
"harness|hellaswag|10": {
"acc": 0.2629954192391954,
"acc_stderr": 0.004393601887506585,
"acc_norm": 0.26528579964150567,
"acc_norm_stderr": 0.004405829993258718
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.025288394502891356,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.025288394502891356
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198816,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198816
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.028346963777162452,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.028346963777162452
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.19298245614035087,
"acc_stderr": 0.037124548537213684,
"acc_norm": 0.19298245614035087,
"acc_norm_stderr": 0.037124548537213684
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438014,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438014
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.03192271569548299,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.03192271569548299
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.03221024508041154,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.03221024508041154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3564102564102564,
"acc_stderr": 0.024283140529467295,
"acc_norm": 0.3564102564102564,
"acc_norm_stderr": 0.024283140529467295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3025210084033613,
"acc_stderr": 0.02983796238829193,
"acc_norm": 0.3025210084033613,
"acc_norm_stderr": 0.02983796238829193
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22752293577981653,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.22752293577981653,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.02910225438967409,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.02910225438967409
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.27802690582959644,
"acc_stderr": 0.030069584874494047,
"acc_norm": 0.27802690582959644,
"acc_norm_stderr": 0.030069584874494047
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3128834355828221,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.3128834355828221,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.19642857142857142,
"acc_stderr": 0.03770970049347018,
"acc_norm": 0.19642857142857142,
"acc_norm_stderr": 0.03770970049347018
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2264957264957265,
"acc_stderr": 0.027421007295392926,
"acc_norm": 0.2264957264957265,
"acc_norm_stderr": 0.027421007295392926
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2848020434227331,
"acc_stderr": 0.01613917409652258,
"acc_norm": 0.2848020434227331,
"acc_norm_stderr": 0.01613917409652258
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22832369942196531,
"acc_stderr": 0.022598703804321635,
"acc_norm": 0.22832369942196531,
"acc_norm_stderr": 0.022598703804321635
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.025755865922632924,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.025755865922632924
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2345679012345679,
"acc_stderr": 0.023576881744005723,
"acc_norm": 0.2345679012345679,
"acc_norm_stderr": 0.023576881744005723
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23728813559322035,
"acc_stderr": 0.010865436690780264,
"acc_norm": 0.23728813559322035,
"acc_norm_stderr": 0.010865436690780264
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2173202614379085,
"acc_stderr": 0.016684820929148587,
"acc_norm": 0.2173202614379085,
"acc_norm_stderr": 0.016684820929148587
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23265306122448978,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.23265306122448978,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.036643147772880864,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.036643147772880864
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.015298077509485083,
"mc2": 0.4727026528122458,
"mc2_stderr": 0.015699277111857743
},
"harness|winogrande|5": {
"acc": 0.505130228887135,
"acc_stderr": 0.01405174596179052
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
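The per-task entries above can also be aggregated locally for quick sanity checks. A minimal sketch over a small illustrative subset of the results (a plain unweighted mean, which is an assumption and may not match the leaderboard's exact aggregation or the precomputed `"all"` block):

```python
# Unweighted mean of per-task "acc" values, skipping the precomputed "all"
# entry; only a small illustrative subset of the results above is included.
results = {
    "all": {"acc": 0.25475467189344064},
    "harness|arc:challenge|25": {"acc": 0.18088737201365188},
    "harness|hellaswag|10": {"acc": 0.2629954192391954},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
}

task_accs = [entry["acc"] for task, entry in results.items() if task != "all"]
mean_acc = sum(task_accs) / len(task_accs)
print(round(mean_acc, 4))  # mean over this subset only, not the full run
```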
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Li-Tang/cn_text | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_cognitivecomputations__dolphin-2.8-experiment26-7b | ---
pretty_name: Evaluation run of cognitivecomputations/dolphin-2.8-experiment26-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cognitivecomputations/dolphin-2.8-experiment26-7b](https://huggingface.co/cognitivecomputations/dolphin-2.8-experiment26-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__dolphin-2.8-experiment26-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-05T00:47:48.033781](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.8-experiment26-7b/blob/main/results_2024-03-05T00-47-48.033781.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6259859767682346,\n\
\ \"acc_stderr\": 0.032722228933055555,\n \"acc_norm\": 0.6269857052836199,\n\
\ \"acc_norm_stderr\": 0.03338852136792568,\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.01699762787190793,\n \"mc2\": 0.5510246746247679,\n\
\ \"mc2_stderr\": 0.015278599523000265\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6040955631399317,\n \"acc_stderr\": 0.014291228393536588,\n\
\ \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068285\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6484763991236805,\n\
\ \"acc_stderr\": 0.004764703145680275,\n \"acc_norm\": 0.8369846644094802,\n\
\ \"acc_norm_stderr\": 0.003686247559361841\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080342,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080342\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n\
\ \"acc_stderr\": 0.046550104113196177,\n \"acc_norm\": 0.3235294117647059,\n\
\ \"acc_norm_stderr\": 0.046550104113196177\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n\
\ \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n\
\ \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"\
acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481006,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481006\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \
\ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.0159901548850734,\n \"acc_norm\"\
: 0.8330275229357799,\n \"acc_norm_stderr\": 0.0159901548850734\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n\
\ \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n\
\ \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n\
\ \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7956577266922095,\n\
\ \"acc_stderr\": 0.014419123980931895,\n \"acc_norm\": 0.7956577266922095,\n\
\ \"acc_norm_stderr\": 0.014419123980931895\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n\
\ \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3027932960893855,\n\
\ \"acc_stderr\": 0.015366860386397112,\n \"acc_norm\": 0.3027932960893855,\n\
\ \"acc_norm_stderr\": 0.015366860386397112\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388995,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388995\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.025557653981868052,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.025557653981868052\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n\
\ \"acc_stderr\": 0.012719949543032207,\n \"acc_norm\": 0.4556714471968709,\n\
\ \"acc_norm_stderr\": 0.012719949543032207\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6213235294117647,\n \"acc_stderr\": 0.02946513363977613,\n\
\ \"acc_norm\": 0.6213235294117647,\n \"acc_norm_stderr\": 0.02946513363977613\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6470588235294118,\n \"acc_stderr\": 0.019333142020797157,\n \
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.019333142020797157\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368032,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368032\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.01699762787190793,\n \"mc2\": 0.5510246746247679,\n\
\ \"mc2_stderr\": 0.015278599523000265\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7876874506708761,\n \"acc_stderr\": 0.011493384687249787\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6262319939347991,\n \
\ \"acc_stderr\": 0.013326342860737018\n }\n}\n```"
repo_url: https://huggingface.co/cognitivecomputations/dolphin-2.8-experiment26-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|arc:challenge|25_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|arc:challenge|25_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|gsm8k|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|gsm8k|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hellaswag|10_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hellaswag|10_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T20-37-13.780824.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T00-47-48.033781.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-05T00-47-48.033781.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- '**/details_harness|winogrande|5_2024-03-02T20-37-13.780824.parquet'
- split: 2024_03_05T00_47_48.033781
path:
- '**/details_harness|winogrande|5_2024-03-05T00-47-48.033781.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-05T00-47-48.033781.parquet'
- config_name: results
data_files:
- split: 2024_03_02T20_37_13.780824
path:
- results_2024-03-02T20-37-13.780824.parquet
- split: 2024_03_05T00_47_48.033781
path:
- results_2024-03-05T00-47-48.033781.parquet
- split: latest
path:
- results_2024-03-05T00-47-48.033781.parquet
---
# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.8-experiment26-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cognitivecomputations/dolphin-2.8-experiment26-7b](https://huggingface.co/cognitivecomputations/dolphin-2.8-experiment26-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__dolphin-2.8-experiment26-7b",
"harness_winogrande_5",
	split="latest")
```
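Besides `latest`, each configuration also exposes timestamped splits, one per run. Since those split names encode the run timestamp, they can be parsed back into `datetime` objects to select a specific run; a minimal sketch (the helper name is an assumption):

```python
from datetime import datetime

def split_to_timestamp(split_name: str) -> datetime:
    # Split names follow the pattern YYYY_MM_DDTHH_MM_SS.microseconds,
    # e.g. "2024_03_05T00_47_48.033781".
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

runs = ["2024_03_02T20_37_13.780824", "2024_03_05T00_47_48.033781"]
latest_run = max(runs, key=split_to_timestamp)
print(latest_run)  # 2024_03_05T00_47_48.033781
```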
## Latest results
These are the [latest results from run 2024-03-05T00:47:48.033781](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.8-experiment26-7b/blob/main/results_2024-03-05T00-47-48.033781.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6259859767682346,
"acc_stderr": 0.032722228933055555,
"acc_norm": 0.6269857052836199,
"acc_norm_stderr": 0.03338852136792568,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190793,
"mc2": 0.5510246746247679,
"mc2_stderr": 0.015278599523000265
},
"harness|arc:challenge|25": {
"acc": 0.6040955631399317,
"acc_stderr": 0.014291228393536588,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068285
},
"harness|hellaswag|10": {
"acc": 0.6484763991236805,
"acc_stderr": 0.004764703145680275,
"acc_norm": 0.8369846644094802,
"acc_norm_stderr": 0.003686247559361841
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.02845015479411864,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.02845015479411864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080342,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080342
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481006,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481006
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.0159901548850734,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.0159901548850734
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7956577266922095,
"acc_stderr": 0.014419123980931895,
"acc_norm": 0.7956577266922095,
"acc_norm_stderr": 0.014419123980931895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3027932960893855,
"acc_stderr": 0.015366860386397112,
"acc_norm": 0.3027932960893855,
"acc_norm_stderr": 0.015366860386397112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388995,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388995
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.025557653981868052,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.025557653981868052
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032207,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032207
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6213235294117647,
"acc_stderr": 0.02946513363977613,
"acc_norm": 0.6213235294117647,
"acc_norm_stderr": 0.02946513363977613
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.019333142020797157,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.019333142020797157
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368032,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368032
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190793,
"mc2": 0.5510246746247679,
"mc2_stderr": 0.015278599523000265
},
"harness|winogrande|5": {
"acc": 0.7876874506708761,
"acc_stderr": 0.011493384687249787
},
"harness|gsm8k|5": {
"acc": 0.6262319939347991,
"acc_stderr": 0.013326342860737018
}
}
```
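As an illustration, per-task accuracies in such a results dictionary can be aggregated programmatically; a minimal sketch using a few values copied from the JSON above:

```python
# A small excerpt of the results dictionary shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.39},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5481481481481482},
    "harness|arc:challenge|25": {"acc": 0.6040955631399317},
}

# Average only the hendrycksTest (MMLU) subtasks, identified by key prefix.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_mean, 4))  # 0.4691
```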
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hazuki_ren/葉月恋 (Love Live! Superstar!!)
This is the dataset of hazuki_ren/葉月恋 (Love Live! Superstar!!), containing 474 images and their tags.
The core tags of this character are `black_hair, long_hair, yellow_eyes, bangs, ponytail, high_ponytail, bow, breasts, ribbon, hair_bow, shiny_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 474 | 638.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hazuki_ren_lovelivesuperstar/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 474 | 331.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hazuki_ren_lovelivesuperstar/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1112 | 718.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hazuki_ren_lovelivesuperstar/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 474 | 546.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hazuki_ren_lovelivesuperstar/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1112 | 1.08 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hazuki_ren_lovelivesuperstar/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hazuki_ren_lovelivesuperstar',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
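If you use one of the IMG+TXT packages instead, each image is accompanied by a same-named `.txt` file of comma-separated tags. A minimal pairing helper (the function name and extension list are assumptions) might look like this:

```python
import os

def load_img_txt_pairs(dataset_dir: str):
    """Pair each image in an extracted IMG+TXT package with its tag list."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if os.path.exists(txt_path):
            with open(txt_path, encoding="utf-8") as f:
                # Tag files are comma-separated, e.g. "1girl, solo, smile".
                tags = [t.strip() for t in f.read().split(",") if t.strip()]
            pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```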
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blue_jacket, grey_dress, long_sleeves, looking_at_viewer, neck_ribbon, open_jacket, red_ribbon, smile, solo, yuigaoka_school_uniform, birthday, blush, pinafore_dress, medium_breasts, upper_body |
| 1 | 14 |  |  |  |  |  | 1girl, blue_jacket, grey_dress, looking_at_viewer, neck_ribbon, open_jacket, pinafore_dress, red_ribbon, solo, yuigaoka_school_uniform, blush, collared_shirt, long_sleeves, smile, simple_background, white_background, closed_mouth, cowboy_shot, white_shirt |
| 2 | 5 |  |  |  |  |  | 1girl, blue_jacket, brown_footwear, closed_mouth, full_body, grey_dress, loafers, long_sleeves, looking_at_viewer, neck_ribbon, open_jacket, pinafore_dress, red_ribbon, smile, solo, standing, white_background, white_shirt, white_socks, yuigaoka_school_uniform, collared_shirt, simple_background, white_bow, arms_behind_back, blush, kneehighs, leaning_forward |
| 3 | 6 |  |  |  |  |  | 1girl, birthday, looking_at_viewer, smile, solo, upper_body, blush, shiny |
| 4 | 10 |  |  |  |  |  | 1girl, birthday, looking_at_viewer, smile, solo, white_gloves, blush, medium_breasts, shiny, upper_body, sleeveless, white_dress, bubble, signature |
| 5 | 5 |  |  |  |  |  | red_bowtie, school_uniform, 1girl, collared_shirt, solo, upper_body, blush, closed_mouth, looking_at_viewer, short_sleeves, medium_breasts, skirt, white_shirt |
| 6 | 6 |  |  |  |  |  | 1girl, open_jacket, solo, full_body, looking_at_viewer, thigh_strap, white_footwear, white_jacket, dress, frills, skirt, detached_collar, simple_background, smile, white_background |
| 7 | 7 |  |  |  |  |  | 1girl, blush, cleavage, collarbone, looking_at_viewer, navel, solo, medium_breasts, white_background, white_bikini, cowboy_shot, parted_lips, simple_background, smile, stomach, bare_shoulders, blue_bikini, halterneck, large_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_jacket | grey_dress | long_sleeves | looking_at_viewer | neck_ribbon | open_jacket | red_ribbon | smile | solo | yuigaoka_school_uniform | birthday | blush | pinafore_dress | medium_breasts | upper_body | collared_shirt | simple_background | white_background | closed_mouth | cowboy_shot | white_shirt | brown_footwear | full_body | loafers | standing | white_socks | white_bow | arms_behind_back | kneehighs | leaning_forward | shiny | white_gloves | sleeveless | white_dress | bubble | signature | red_bowtie | school_uniform | short_sleeves | skirt | thigh_strap | white_footwear | white_jacket | dress | frills | detached_collar | cleavage | collarbone | navel | white_bikini | parted_lips | stomach | bare_shoulders | blue_bikini | halterneck | large_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:-------------|:---------------|:--------------------|:--------------|:--------------|:-------------|:--------|:-------|:--------------------------|:-----------|:--------|:-----------------|:-----------------|:-------------|:-----------------|:--------------------|:-------------------|:---------------|:--------------|:--------------|:-----------------|:------------|:----------|:-----------|:--------------|:------------|:-------------------|:------------|:------------------|:--------|:---------------|:-------------|:--------------|:---------|:------------|:-------------|:-----------------|:----------------|:--------|:--------------|:-----------------|:---------------|:--------|:---------|:------------------|:-----------|:-------------|:--------|:---------------|:--------------|:----------|:-----------------|:--------------|:-------------|:----------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | X | | | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | | X | | | | X | X | | X | X | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | | | | X | | | | X | X | | X | X | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | | X | | | | | X | | | X | | X | X | X | | | X | | X | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | | X | | X | | X | X | | | | | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | | | | X | | | | X | X | | | X | | X | | | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
thisisHJLee/vi_data_made3 | ---
license: apache-2.0
---
|
CVasNLPExperiments/textvqa_mini_validation_google_flan_t5_xxl_mode_OCR_VQA_Q_rices_ns_10 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0
num_bytes: 15008
num_examples: 10
download_size: 7112
dataset_size: 15008
configs:
- config_name: default
data_files:
- split: fewshot_0
path: data/fewshot_0-*
---
|
gu37/In-Shop-Clothes-Segmentation | ---
license: mit
---
|
open-llm-leaderboard/details_gagan3012__MetaModel | ---
pretty_name: Evaluation run of gagan3012/MetaModel
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gagan3012/MetaModel](https://huggingface.co/gagan3012/MetaModel) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one\
\ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gagan3012__MetaModel\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T14:09:43.780941](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__MetaModel/blob/main/results_2024-01-04T14-09-43.780941.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6664380298886512,\n\
\ \"acc_stderr\": 0.031642195230944255,\n \"acc_norm\": 0.6671639222858992,\n\
\ \"acc_norm_stderr\": 0.03228745343467652,\n \"mc1\": 0.5691554467564259,\n\
\ \"mc1_stderr\": 0.01733527247533237,\n \"mc2\": 0.7184177934834866,\n\
\ \"mc2_stderr\": 0.014995634120330182\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.01325001257939344\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7132045409281019,\n\
\ \"acc_stderr\": 0.004513409114983828,\n \"acc_norm\": 0.8844851623182632,\n\
\ \"acc_norm_stderr\": 0.0031898897894046684\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.05021167315686781,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.05021167315686781\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.025751310131230234,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.025751310131230234\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252252,\n\
\ \"acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252252\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097114,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097114\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009246,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009246\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \
\ \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.035865947385739734,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.035865947385739734\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39664804469273746,\n\
\ \"acc_stderr\": 0.016361354769822468,\n \"acc_norm\": 0.39664804469273746,\n\
\ \"acc_norm_stderr\": 0.016361354769822468\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694905,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694905\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02313237623454333,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02313237623454333\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49478487614080835,\n\
\ \"acc_stderr\": 0.012769541449652547,\n \"acc_norm\": 0.49478487614080835,\n\
\ \"acc_norm_stderr\": 0.012769541449652547\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.026303648393696036,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.026303648393696036\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.6813725490196079,\n \"acc_stderr\": 0.018850084696468712,\n \"\
acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.018850084696468712\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5691554467564259,\n\
\ \"mc1_stderr\": 0.01733527247533237,\n \"mc2\": 0.7184177934834866,\n\
\ \"mc2_stderr\": 0.014995634120330182\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370632\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6535253980288097,\n \
\ \"acc_stderr\": 0.013107179054313398\n }\n}\n```"
repo_url: https://huggingface.co/gagan3012/MetaModel
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|arc:challenge|25_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|gsm8k|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hellaswag|10_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-09-43.780941.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T14-09-43.780941.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- '**/details_harness|winogrande|5_2024-01-04T14-09-43.780941.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T14-09-43.780941.parquet'
- config_name: results
data_files:
- split: 2024_01_04T14_09_43.780941
path:
- results_2024-01-04T14-09-43.780941.parquet
- split: latest
path:
- results_2024-01-04T14-09-43.780941.parquet
---
# Dataset Card for Evaluation run of gagan3012/MetaModel
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gagan3012/MetaModel](https://huggingface.co/gagan3012/MetaModel) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gagan3012__MetaModel",
"harness_winogrande_5",
split="train")
```
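Since every MMLU task follows the same `harness_hendrycksTest_<task>_5` naming scheme visible in the configuration list above, loading can be parameterized. The sketch below is illustrative: the helper names (`mmlu_config`, `load_task_details`, `load_aggregated_results`) are hypothetical conveniences, not part of any official API; only the repository id and config/split names come from this card.

```python
REPO = "open-llm-leaderboard/details_gagan3012__MetaModel"


def mmlu_config(task: str, n_shot: int = 5) -> str:
    """Build the config name for one MMLU task, following the
    harness_hendrycksTest_<task>_<n_shot> scheme used by this repo."""
    return f"harness_hendrycksTest_{task}_{n_shot}"


def load_task_details(task: str):
    """Per-sample details for a single MMLU task, latest timestamp."""
    from datasets import load_dataset  # deferred so the module imports cheaply

    return load_dataset(REPO, mmlu_config(task), split="latest")


def load_aggregated_results():
    """Aggregated scores from the 'results' configuration."""
    from datasets import load_dataset

    return load_dataset(REPO, "results", split="latest")
```

For example, `load_task_details("anatomy")` would fetch the split backing the `harness_hendrycksTest_anatomy_5` configuration shown above.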
## Latest results
These are the [latest results from run 2024-01-04T14:09:43.780941](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__MetaModel/blob/main/results_2024-01-04T14-09-43.780941.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6664380298886512,
"acc_stderr": 0.031642195230944255,
"acc_norm": 0.6671639222858992,
"acc_norm_stderr": 0.03228745343467652,
"mc1": 0.5691554467564259,
"mc1_stderr": 0.01733527247533237,
"mc2": 0.7184177934834866,
"mc2_stderr": 0.014995634120330182
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.01325001257939344
},
"harness|hellaswag|10": {
"acc": 0.7132045409281019,
"acc_stderr": 0.004513409114983828,
"acc_norm": 0.8844851623182632,
"acc_norm_stderr": 0.0031898897894046684
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686781,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686781
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5,
"acc_stderr": 0.025751310131230234,
"acc_norm": 0.5,
"acc_norm_stderr": 0.025751310131230234
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097114,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097114
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009246,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009246
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.035865947385739734,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.035865947385739734
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39664804469273746,
"acc_stderr": 0.016361354769822468,
"acc_norm": 0.39664804469273746,
"acc_norm_stderr": 0.016361354769822468
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694905,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694905
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02313237623454333,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02313237623454333
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49478487614080835,
"acc_stderr": 0.012769541449652547,
"acc_norm": 0.49478487614080835,
"acc_norm_stderr": 0.012769541449652547
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.75,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.75,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.018850084696468712,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.018850084696468712
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5691554467564259,
"mc1_stderr": 0.01733527247533237,
"mc2": 0.7184177934834866,
"mc2_stderr": 0.014995634120330182
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370632
},
"harness|gsm8k|5": {
"acc": 0.6535253980288097,
"acc_stderr": 0.013107179054313398
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/yanfei_genshin | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yanfei/煙緋/烟绯 (Genshin Impact)
This is the dataset of yanfei/煙緋/烟绯 (Genshin Impact), containing 500 images and their tags.
The core tags of this character are `pink_hair, long_hair, horns, green_eyes, hair_between_eyes, red_headwear, hat, breasts, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 1.06 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yanfei_genshin/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 878.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yanfei_genshin/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1337 | 1.80 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yanfei_genshin/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yanfei_genshin',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, bare_shoulders, black_bra, cleavage, crop_top, detached_sleeves, long_sleeves, looking_at_viewer, midriff, solo, cowboy_shot, navel, stomach, antlers, red_skirt, white_background, :d, open_mouth, simple_background, yellow_bow, standing, very_long_hair, holding, book |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_bra, cleavage, crop_top, detached_sleeves, long_sleeves, looking_at_viewer, midriff, smile, solo, antlers, upper_body, white_background, navel, simple_background, stomach |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, bloomers, cleavage, crop_top, detached_sleeves, long_sleeves, midriff, navel, smile, solo, standing, stomach, antlers, black_bra, cowboy_shot, black_shorts, looking_at_viewer, red_skirt, fire, simple_background, thighs, yellow_bow |
| 3 | 9 |  |  |  |  |  | 1girl, antlers, bare_shoulders, crop_top, detached_sleeves, long_sleeves, looking_at_viewer, midriff, solo, navel, open_mouth, stomach, cleavage, red_skirt, :d |
| 4 | 5 |  |  |  |  |  | 1girl, antlers, bare_shoulders, crop_top, detached_sleeves, long_sleeves, looking_at_viewer, open_mouth, solo, boots, midriff, red_footwear, skirt, :d, holding, staff |
| 5 | 6 |  |  |  |  |  | 1girl, antlers, bare_shoulders, detached_sleeves, looking_at_viewer, solo, upper_body, holding_book, long_sleeves, smile, closed_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_bra | cleavage | crop_top | detached_sleeves | long_sleeves | looking_at_viewer | midriff | solo | cowboy_shot | navel | stomach | antlers | red_skirt | white_background | :d | open_mouth | simple_background | yellow_bow | standing | very_long_hair | holding | book | smile | upper_body | bloomers | black_shorts | fire | thighs | boots | red_footwear | skirt | staff | holding_book | closed_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:------------|:-----------|:-----------|:-------------------|:---------------|:--------------------|:----------|:-------|:--------------|:--------|:----------|:----------|:------------|:-------------------|:-----|:-------------|:--------------------|:-------------|:-----------|:-----------------|:----------|:-------|:--------|:-------------|:-----------|:---------------|:-------|:---------|:--------|:---------------|:--------|:--------|:---------------|:---------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | X | X | X | | X | | | X | | | | | | X | X | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | X | X | X | | | | X | | X | X | X | X | | | | | | |
| 3 | 9 |  |  |  |  |  | X | X | | X | X | X | X | X | X | X | | X | X | X | X | | X | X | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | | X | X | X | X | X | X | | | | X | | | X | X | | | | | X | | | | | | | | X | X | X | X | | |
| 5 | 6 |  |  |  |  |  | X | X | | | | X | X | X | | X | | | | X | | | | | | | | | | | X | X | | | | | | | | | X | X |
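The tag strings in the raw-text table above are plain comma-separated lists, so they can be parsed for downstream filtering (a minimal sketch):

```python
# example tag string copied from cluster 5 above
raw = "1girl, antlers, bare_shoulders, detached_sleeves, looking_at_viewer, solo"
tags = [t.strip() for t in raw.split(",")]

def has_all(tags, required):
    """True when every required tag is present in the tag list."""
    return set(required).issubset(tags)
```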
|
huggingartists/elton-john | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/elton-john"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 1.422945 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/ec76d346c4c8b057169194c1781021fd.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/elton-john">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Elton John</div>
<a href="https://genius.com/artists/elton-john">
<div style="text-align: center; font-size: 14px;">@elton-john</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/elton-john).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/elton-john")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|1311| -| -|
The 'train' split can easily be divided into 'train', 'validation' and 'test' subsets with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/elton-john")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train_end = int(len(datasets['train']['text']) * train_percentage)
val_end = int(len(datasets['train']['text']) * (train_percentage + validation_percentage))
train, validation, test = np.split(datasets['train']['text'], [train_end, val_end])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
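The same 90/7/3 split can also be written with plain list slicing, avoiding the NumPy round-trip (a sketch on stand-in data):

```python
texts = [f"lyric {i}" for i in range(100)]  # stand-in for datasets['train']['text']
n = len(texts)
train_end = round(n * 0.90)            # first 90% for training
val_end = round(n * (0.90 + 0.07))     # next 7% for validation, remaining 3% for test
train, validation, test = texts[:train_end], texts[train_end:val_end], texts[val_end:]
# 90 / 7 / 3 examples respectively
```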
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk},
year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
ysharma/dummy123 | ---
license: mit
---
|
CVasNLPExperiments/OxfordPets_test_google_flan_t5_xxl_mode_A_T_SPECIFIC_ns_3669 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 1065643
num_examples: 3669
download_size: 193852
dataset_size: 1065643
---
# Dataset Card for "OxfordPets_test_google_flan_t5_xxl_mode_A_T_SPECIFIC_ns_3669"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
skytnt/japanese-lyric | ---
license: cc0-1.0
task_categories:
- text-generation
language:
- ja
tags:
- music
pretty_name: Japanese Lyric
size_categories:
- 10K<n<100K
--- |
HuggingFaceM4/ScienceQAImg_Modif | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: context
dtype: string
- name: label
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
'4': E
splits:
- name: train
num_bytes: 204229907.55288386
num_examples: 6218
- name: validation
num_bytes: 68613530.46875736
num_examples: 2097
- name: test
num_bytes: 65108877.472058475
num_examples: 2017
download_size: 661814327
dataset_size: 337952315.4936997
---
# Dataset Card for "ScienceQAImg_Modif"
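Per the `class_label` mapping in the YAML header above, answers are stored as integer ids 0-4; a minimal sketch for mapping them back to answer letters:

```python
label_names = ["A", "B", "C", "D", "E"]  # from the class_label names above

def to_letter(label_id: int) -> str:
    """Map an integer label id back to its answer letter."""
    return label_names[label_id]
```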
This dataset contains the [ScienceQA benchmark](https://arxiv.org/pdf/2209.09513.pdf) where only examples with an image are kept, and where we formatted the prompt. |
csaybar/CloudSEN12-nolabel | ---
license: cc-by-nc-4.0
---
# **CloudSEN12 NOLABEL**
## **A Benchmark Dataset for Cloud Semantic Understanding**

CloudSEN12 is a LARGE dataset (~1 TB) for cloud semantic understanding, consisting of 49,400 image patches (IPs) spread evenly across all continents except Antarctica. Each IP covers 5090 x 5090 meters and contains data from Sentinel-2
levels 1C and 2A, hand-crafted annotations of thick and thin clouds and cloud shadows, Sentinel-1 Synthetic Aperture Radar (SAR) data,
a digital elevation model, surface water occurrence, land cover classes, and cloud mask results from six cutting-edge
cloud detection algorithms.
CloudSEN12 is designed to support both weakly supervised and self-/semi-supervised learning strategies by including three distinct forms of
hand-crafted labeling data: high-quality, scribble, and no-annotation. For more details on how we created the dataset, see our
paper.
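The headline numbers above imply substantial coverage: with 49,400 patches of 5090 x 5090 m each, the total labeled area works out to roughly 1.28 million km², ignoring any overlap between patches (a quick back-of-the-envelope check):

```python
n_patches = 49_400
side_km = 5090 / 1000          # each patch is 5090 x 5090 meters
area_km2 = n_patches * side_km ** 2
# 49,400 * 25.9081 km² ≈ 1,279,860 km²
```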
Ready to start using **[CloudSEN12](https://cloudsen12.github.io/)**?
**[Download Dataset](https://cloudsen12.github.io/download.html)**
**[Paper - Scientific Data](https://www.nature.com/articles/s41597-022-01878-2)**
**[Inference on a new S2 image](https://colab.research.google.com/github/cloudsen12/examples/blob/master/example02.ipynb)**
**[Enter to cloudApp](https://github.com/cloudsen12/CloudApp)**
**[CloudSEN12 in Google Earth Engine](https://gee-community-catalog.org/projects/cloudsen12/)**
<br>
### **Description**
<br>
| File | Name | Scale | Wavelength | Description | Datatype |
|---------------|-----------------|--------|------------------------------|------------------------------------------------------------------------------------------------------|----------|
| L1C_ & L2A_ | B1 | 0.0001 | 443.9nm (S2A) / 442.3nm (S2B)| Aerosols. | np.int16 |
| | B2 | 0.0001 | 496.6nm (S2A) / 492.1nm (S2B)| Blue. | np.int16 |
| | B3 | 0.0001 | 560nm (S2A) / 559nm (S2B) | Green. | np.int16 |
| | B4 | 0.0001 | 664.5nm (S2A) / 665nm (S2B) | Red. | np.int16 |
| | B5 | 0.0001 | 703.9nm (S2A) / 703.8nm (S2B)| Red Edge 1. | np.int16 |
| | B6 | 0.0001 | 740.2nm (S2A) / 739.1nm (S2B)| Red Edge 2. | np.int16 |
| | B7 | 0.0001 | 782.5nm (S2A) / 779.7nm (S2B)| Red Edge 3. | np.int16 |
| | B8 | 0.0001 | 835.1nm (S2A) / 833nm (S2B) | NIR. | np.int16 |
| | B8A | 0.0001 | 864.8nm (S2A) / 864nm (S2B) | Red Edge 4. | np.int16 |
| | B9 | 0.0001 | 945nm (S2A) / 943.2nm (S2B) | Water vapor. | np.int16 |
| | B11 | 0.0001 | 1613.7nm (S2A) / 1610.4nm (S2B)| SWIR 1. | np.int16 |
| | B12 | 0.0001 | 2202.4nm (S2A) / 2185.7nm (S2B)| SWIR 2. | np.int16 |
| L1C_ | B10 | 0.0001 | 1373.5nm (S2A) / 1376.9nm (S2B)| Cirrus. | np.int16 |
| L2A_ | AOT | 0.001 | - | Aerosol Optical Thickness. | np.int16 |
| | WVP | 0.001 | - | Water Vapor Pressure. | np.int16 |
| | TCI_R | 1 | - | True Color Image, Red. | np.int16 |
| | TCI_G | 1 | - | True Color Image, Green. | np.int16 |
| | TCI_B | 1 | - | True Color Image, Blue. | np.int16 |
| S1_ | VV | 1 | 5.405GHz | Dual-band cross-polarization, vertical transmit/horizontal receive. |np.float32|
| | VH | 1 | 5.405GHz | Single co-polarization, vertical transmit/vertical receive. |np.float32|
| | angle | 1 | - | Incidence angle generated by interpolating the ‘incidenceAngle’ property. |np.float32|
| EXTRA_ | CDI | 0.0001 | - | Cloud Displacement Index. | np.int16 |
| | Shwdirection | 0.01 | - | Azimuth. Values range from 0°- 360°. | np.int16 |
| | elevation | 1 | - | Elevation in meters. Obtained from MERIT Hydro datasets. | np.int16 |
| | ocurrence | 1 | - | JRC Global Surface Water. The frequency with which water was present. | np.int16 |
| | LC100 | 1 | - | Copernicus land cover product. CGLS-LC100 Collection 3. | np.int16 |
| | LC10 | 1 | - | ESA WorldCover 10m v100 product. | np.int16 |
| LABEL_ | fmask | 1 | - | Fmask4.0 cloud masking. | np.int16 |
| | QA60 | 1 | - | SEN2 Level-1C cloud mask. | np.int8 |
| | s2cloudless | 1 | - | s2cloudless results. | np.int8 |
| | sen2cor | 1 | - | Scene Classification band. Obtained from SEN2 level 2A. | np.int8 |
| | cd_fcnn_rgbi | 1 | - | López-Puigdollers et al. results based on RGBI bands. | np.int8 |
| |cd_fcnn_rgbi_swir| 1 | - | López-Puigdollers et al. results based on RGBISWIR bands. | np.int8 |
| | kappamask_L1C | 1 | - | KappaMask results using SEN2 level L1C as input. | np.int8 |
| | kappamask_L2A | 1 | - | KappaMask results using SEN2 level L2A as input. | np.int8 |
| | manual_hq | 1 | | High-quality pixel-wise manual annotation. | np.int8 |
| | manual_sc | 1 | | Scribble manual annotation. | np.int8 |
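Per the Scale column above, bands are stored as scaled integers; multiplying a raw digital number by its scale factor recovers the physical quantity (a minimal sketch for a reflectance band with scale 0.0001; the digital number here is an invented example):

```python
dn = 5321            # example int16 digital number from an L1C band
scale = 0.0001       # Scale column for the reflectance bands
reflectance = dn * scale  # ≈ 0.5321, dimensionless reflectance
```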
<br>
### **Label Description**
| **CloudSEN12** | **KappaMask** | **Sen2Cor** | **Fmask** | **s2cloudless** | **CD-FCNN** | **QA60** |
|------------------|------------------|-------------------------|-----------------|-----------------------|---------------------|--------------------|
| 0 Clear | 1 Clear | 4 Vegetation | 0 Clear land | 0 Clear | 0 Clear | 0 Clear |
| | | 2 Dark area pixels | 1 Clear water | | | |
| | | 5 Bare Soils | 3 Snow | | | |
| | | 6 Water | | | | |
| | | 11 Snow | | | | |
| 1 Thick cloud | 4 Cloud | 8 Cloud medium probability | 4 Cloud | 1 Cloud | 1 Cloud | 1024 Opaque cloud |
| | | 9 Cloud high probability | | | | |
| 2 Thin cloud | 3 Semi-transparent cloud | 10 Thin cirrus | | | | 2048 Cirrus cloud |
| 3 Cloud shadow | 2 Cloud shadow | 3 Cloud shadows | 2 Cloud shadow | | | |
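The table above can be turned into a remapping dictionary to harmonize, e.g., Sen2Cor's SCL values with the CloudSEN12 classes (a sketch; the values are taken directly from the Sen2Cor column):

```python
# Sen2Cor SCL value -> CloudSEN12 class (0 clear, 1 thick cloud, 2 thin cloud, 3 cloud shadow)
SEN2COR_TO_CLOUDSEN12 = {
    4: 0, 2: 0, 5: 0, 6: 0, 11: 0,   # vegetation, dark area, bare soil, water, snow
    8: 1, 9: 1,                      # cloud medium/high probability
    10: 2,                           # thin cirrus
    3: 3,                            # cloud shadows
}

def remap_sen2cor(scl_value: int) -> int:
    """Translate a Sen2Cor SCL value into the CloudSEN12 class scheme."""
    return SEN2COR_TO_CLOUDSEN12[scl_value]
```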
<br>
### **np.memmap shape information**
<br>
**cloudfree (0%) shape: (5880, 512, 512)**
<br>
**almostclear (0-25%) shape: (5880, 512, 512)**
<br>
**lowcloudy (25-45%) shape: (5880, 512, 512)**
<br>
**midcloudy (45-65%) shape: (5880, 512, 512)**
<br>
**cloudy (>65%) shape: (5880, 512, 512)**
<br>
### **Example**
<br>
```py
import numpy as np
# Read Band 4 and the high-quality manual labels of the cloudfree split
cloudfree_shape = (5880, 512, 512)
B4X = np.memmap('cloudfree/L1C_B04.dat', dtype='int16', mode='r', shape=cloudfree_shape)
y = np.memmap('cloudfree/manual_hq.dat', dtype='int8', mode='r', shape=cloudfree_shape)
# Read Band 4 and the KappaMask L1C labels of the almostclear split
almostclear_shape = (5880, 512, 512)
B4X = np.memmap('almostclear/L1C_B04.dat', dtype='int16', mode='r', shape=almostclear_shape)
y = np.memmap('almostclear/kappamask_L1C.dat', dtype='int8', mode='r', shape=almostclear_shape)
# Read Band 4 and the KappaMask L1C labels of the midcloudy split
midcloudy_shape = (5880, 512, 512)
B4X = np.memmap('midcloudy/L1C_B04.dat', dtype='int16', mode='r', shape=midcloudy_shape)
y = np.memmap('midcloudy/kappamask_L1C.dat', dtype='int8', mode='r', shape=midcloudy_shape)
```
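Before memory-mapping, it can help to verify that a `.dat` file has the size implied by its shape and dtype; a mismatch usually indicates a truncated download (a sketch; the sizes follow directly from the shapes above):

```python
import numpy as np

shape = (5880, 512, 512)
n_values = int(np.prod(shape))
expected_int16 = n_values * np.dtype("int16").itemsize  # band files, e.g. L1C_B04.dat
expected_int8 = n_values * np.dtype("int8").itemsize    # label files, e.g. manual_hq.dat
# ~3.08 GB per int16 band file, ~1.54 GB per int8 label file;
# compare with os.path.getsize('cloudfree/L1C_B04.dat') before memmapping
```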
<br>
This work has been partially supported by the Spanish Ministry of Science and Innovation project
PID2019-109026RB-I00 (MINECO-ERDF) and the Austrian Space Applications Programme within the
**[SemantiX project](https://austria-in-space.at/en/projects/2019/semantix.php)**.
|
communityai/communityai_apt-instruct-code-micro-70k | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 291560981.6281767
num_examples: 70000
download_size: 129436280
dataset_size: 291560981.6281767
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yjernite/prof_report__dalle-2__sd_21__24 | ---
dataset_info:
features:
- name: cluster_id
dtype: int64
- name: cluster_size
dtype: int64
- name: img_ids
sequence: int64
- name: img_cluster_scores
sequence: float64
splits:
- name: paralegal
num_bytes: 3624
num_examples: 11
- name: bartender
num_bytes: 3576
num_examples: 9
- name: facilities_manager
num_bytes: 3576
num_examples: 9
- name: accountant
num_bytes: 3480
num_examples: 5
- name: graphic_designer
num_bytes: 3600
num_examples: 10
- name: network_administrator
num_bytes: 3648
num_examples: 12
- name: financial_manager
num_bytes: 3504
num_examples: 6
- name: baker
num_bytes: 3600
num_examples: 10
- name: security_guard
num_bytes: 3528
num_examples: 7
- name: artist
num_bytes: 3696
num_examples: 14
- name: author
num_bytes: 3648
num_examples: 12
- name: printing_press_operator
num_bytes: 3504
num_examples: 6
- name: public_relations_specialist
num_bytes: 3552
num_examples: 8
- name: sheet_metal_worker
num_bytes: 3504
num_examples: 6
- name: clergy
num_bytes: 3600
num_examples: 10
- name: payroll_clerk
num_bytes: 3624
num_examples: 11
- name: teller
num_bytes: 3600
num_examples: 10
- name: real_estate_broker
num_bytes: 3480
num_examples: 5
- name: customer_service_representative
num_bytes: 3600
num_examples: 10
- name: painter
num_bytes: 3696
num_examples: 14
- name: tractor_operator
num_bytes: 3552
num_examples: 8
- name: dental_hygienist
num_bytes: 3528
num_examples: 7
- name: industrial_engineer
num_bytes: 3576
num_examples: 9
- name: electrician
num_bytes: 3576
num_examples: 9
- name: head_cook
num_bytes: 3576
num_examples: 9
- name: health_technician
num_bytes: 3576
num_examples: 9
- name: carpet_installer
num_bytes: 3624
num_examples: 11
- name: purchasing_agent
num_bytes: 3552
num_examples: 8
- name: supervisor
num_bytes: 3552
num_examples: 8
- name: civil_engineer
num_bytes: 3504
num_examples: 6
- name: lawyer
num_bytes: 3576
num_examples: 9
- name: language_pathologist
num_bytes: 3576
num_examples: 9
- name: ceo
num_bytes: 3480
num_examples: 5
- name: computer_support_specialist
num_bytes: 3480
num_examples: 5
- name: postal_worker
num_bytes: 3576
num_examples: 9
- name: mechanical_engineer
num_bytes: 3576
num_examples: 9
- name: nursing_assistant
num_bytes: 3528
num_examples: 7
- name: dentist
num_bytes: 3600
num_examples: 10
- name: tutor
num_bytes: 3576
num_examples: 9
- name: butcher
num_bytes: 3552
num_examples: 8
- name: insurance_agent
num_bytes: 3552
num_examples: 8
- name: courier
num_bytes: 3552
num_examples: 8
- name: computer_programmer
num_bytes: 3528
num_examples: 7
- name: truck_driver
num_bytes: 3480
num_examples: 5
- name: mechanic
num_bytes: 3576
num_examples: 9
- name: marketing_manager
num_bytes: 3528
num_examples: 7
- name: sales_manager
num_bytes: 3480
num_examples: 5
- name: correctional_officer
num_bytes: 3528
num_examples: 7
- name: manager
num_bytes: 3504
num_examples: 6
- name: underwriter
num_bytes: 3528
num_examples: 7
- name: executive_assistant
num_bytes: 3528
num_examples: 7
- name: designer
num_bytes: 3576
num_examples: 9
- name: groundskeeper
num_bytes: 3624
num_examples: 11
- name: mental_health_counselor
num_bytes: 3600
num_examples: 10
- name: aerospace_engineer
num_bytes: 3552
num_examples: 8
- name: taxi_driver
num_bytes: 3552
num_examples: 8
- name: nurse
num_bytes: 3504
num_examples: 6
- name: data_entry_keyer
num_bytes: 3624
num_examples: 11
- name: musician
num_bytes: 3624
num_examples: 11
- name: event_planner
num_bytes: 3696
num_examples: 14
- name: writer
num_bytes: 3576
num_examples: 9
- name: cook
num_bytes: 3648
num_examples: 12
- name: welder
num_bytes: 3552
num_examples: 8
- name: producer
num_bytes: 3648
num_examples: 12
- name: hairdresser
num_bytes: 3672
num_examples: 13
- name: farmer
num_bytes: 3528
num_examples: 7
- name: construction_worker
num_bytes: 3576
num_examples: 9
- name: air_conditioning_installer
num_bytes: 3504
num_examples: 6
- name: electrical_engineer
num_bytes: 3504
num_examples: 6
- name: occupational_therapist
num_bytes: 3552
num_examples: 8
- name: career_counselor
num_bytes: 3528
num_examples: 7
- name: interior_designer
num_bytes: 3648
num_examples: 12
- name: jailer
num_bytes: 3528
num_examples: 7
- name: office_clerk
num_bytes: 3504
num_examples: 6
- name: market_research_analyst
num_bytes: 3576
num_examples: 9
- name: laboratory_technician
num_bytes: 3576
num_examples: 9
- name: social_assistant
num_bytes: 3552
num_examples: 8
- name: medical_records_specialist
num_bytes: 3624
num_examples: 11
- name: machinery_mechanic
num_bytes: 3504
num_examples: 6
- name: police_officer
num_bytes: 3552
num_examples: 8
- name: software_developer
num_bytes: 3480
num_examples: 5
- name: clerk
num_bytes: 3480
num_examples: 5
- name: salesperson
num_bytes: 3552
num_examples: 8
- name: social_worker
num_bytes: 3672
num_examples: 13
- name: director
num_bytes: 3504
num_examples: 6
- name: fast_food_worker
num_bytes: 3648
num_examples: 12
- name: singer
num_bytes: 3696
num_examples: 14
- name: metal_worker
num_bytes: 3576
num_examples: 9
- name: cleaner
num_bytes: 3648
num_examples: 12
- name: computer_systems_analyst
num_bytes: 3600
num_examples: 10
- name: dental_assistant
num_bytes: 3504
num_examples: 6
- name: psychologist
num_bytes: 3576
num_examples: 9
- name: machinist
num_bytes: 3504
num_examples: 6
- name: therapist
num_bytes: 3504
num_examples: 6
- name: veterinarian
num_bytes: 3528
num_examples: 7
- name: teacher
num_bytes: 3576
num_examples: 9
- name: architect
num_bytes: 3552
num_examples: 8
- name: office_worker
num_bytes: 3552
num_examples: 8
- name: drywall_installer
num_bytes: 3552
num_examples: 8
- name: nutritionist
num_bytes: 3552
num_examples: 8
- name: librarian
num_bytes: 3576
num_examples: 9
- name: childcare_worker
num_bytes: 3576
num_examples: 9
- name: school_bus_driver
num_bytes: 3504
num_examples: 6
- name: file_clerk
num_bytes: 3504
num_examples: 6
- name: logistician
num_bytes: 3528
num_examples: 7
- name: scientist
num_bytes: 3528
num_examples: 7
- name: teaching_assistant
num_bytes: 3576
num_examples: 9
- name: radiologic_technician
num_bytes: 3504
num_examples: 6
- name: manicurist
num_bytes: 3600
num_examples: 10
- name: community_manager
num_bytes: 3528
num_examples: 7
- name: carpenter
num_bytes: 3600
num_examples: 10
- name: claims_appraiser
num_bytes: 3552
num_examples: 8
- name: dispatcher
num_bytes: 3576
num_examples: 9
- name: cashier
num_bytes: 3600
num_examples: 10
- name: roofer
num_bytes: 3504
num_examples: 6
- name: photographer
num_bytes: 3600
num_examples: 10
- name: detective
num_bytes: 3576
num_examples: 9
- name: financial_advisor
num_bytes: 3504
num_examples: 6
- name: wholesale_buyer
num_bytes: 3576
num_examples: 9
- name: it_specialist
num_bytes: 3456
num_examples: 4
- name: pharmacy_technician
num_bytes: 3600
num_examples: 10
- name: engineer
num_bytes: 3456
num_examples: 4
- name: mover
num_bytes: 3696
num_examples: 14
- name: plane_mechanic
num_bytes: 3504
num_examples: 6
- name: interviewer
num_bytes: 3552
num_examples: 8
- name: massage_therapist
num_bytes: 3624
num_examples: 11
- name: dishwasher
num_bytes: 3624
num_examples: 11
- name: fitness_instructor
num_bytes: 3576
num_examples: 9
- name: credit_counselor
num_bytes: 3552
num_examples: 8
- name: stocker
num_bytes: 3672
num_examples: 13
- name: pharmacist
num_bytes: 3504
num_examples: 6
- name: doctor
num_bytes: 3552
num_examples: 8
- name: compliance_officer
num_bytes: 3552
num_examples: 8
- name: aide
num_bytes: 3600
num_examples: 10
- name: bus_driver
num_bytes: 3552
num_examples: 8
- name: financial_analyst
num_bytes: 3504
num_examples: 6
- name: receptionist
num_bytes: 3624
num_examples: 11
- name: janitor
num_bytes: 3576
num_examples: 9
- name: plumber
num_bytes: 3528
num_examples: 7
- name: physical_therapist
num_bytes: 3528
num_examples: 7
- name: inventory_clerk
num_bytes: 3576
num_examples: 9
- name: firefighter
num_bytes: 3528
num_examples: 7
- name: coach
num_bytes: 3528
num_examples: 7
- name: maid
num_bytes: 3528
num_examples: 7
- name: pilot
num_bytes: 3504
num_examples: 6
- name: repair_worker
num_bytes: 3576
num_examples: 9
download_size: 867106
dataset_size: 520104
---
# Dataset Card for "prof_report__dalle-2__sd_21__24"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hisaishi_kanade_soundeuphonium | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Hisaishi Kanade/久石奏 (Sound! Euphonium)
This is the dataset of Hisaishi Kanade/久石奏 (Sound! Euphonium), containing 182 images and their tags.
The core tags of this character are `short_hair, black_hair, bow, hair_bow, red_eyes, red_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 182 | 122.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hisaishi_kanade_soundeuphonium/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 182 | 122.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hisaishi_kanade_soundeuphonium/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 359 | 220.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hisaishi_kanade_soundeuphonium/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
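The IMG+TXT packages above pair each image with a same-named `.txt` file holding its comma-separated tags. A minimal sketch of walking such an extracted package directory and collecting the pairs (the directory layout and file extensions here are assumptions about the archive contents, not a documented API):

```python
import os
import tempfile

def load_img_txt_pairs(dataset_dir, image_exts=('.png', '.jpg', '.webp')):
    """Pair each image file with the tags from its same-named .txt file, if present."""
    pairs = []
    for fname in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(fname)
        if ext.lower() not in image_exts:
            continue
        tags = None
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, encoding='utf-8') as f:
                tags = f.read().strip()
        pairs.append((os.path.join(dataset_dir, fname), tags))
    return pairs

# Tiny demo with stand-in files (a real package is a zip of such IMG+TXT pairs).
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, '1.png'), 'wb').close()
    with open(os.path.join(d, '1.txt'), 'w', encoding='utf-8') as f:
        f.write('1girl, solo')
    print(load_img_txt_pairs(d)[0][1])  # -> 1girl, solo
```

For the raw package with full metadata, the waifuc-based loader in the next section is the intended route.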
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hisaishi_kanade_soundeuphonium',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, blue_sailor_collar, blush, green_neckerchief, kitauji_high_school_uniform, serafuku, white_shirt, indoors, solo, closed_mouth, looking_at_viewer, short_sleeves, chalkboard, holding_instrument |
| 1 | 8 |  |  |  |  |  | 1girl, blue_sailor_collar, blue_skirt, blush, chair, classroom, closed_mouth, indoors, kitauji_high_school_uniform, serafuku, sitting, white_shirt, desk, holding_instrument, pleated_skirt, short_sleeves, brown_eyes, green_neckerchief, chalkboard, looking_at_viewer, solo_focus, smile |
| 2 | 5 |  |  |  |  |  | 1girl, blue_sailor_collar, blue_skirt, blurry, blush, chair, chalkboard, classroom, desk, green_neckerchief, indoors, instrument, kitauji_high_school_uniform, pleated_skirt, serafuku, short_sleeves, solo, standing, white_shirt, open_mouth, bag, looking_at_viewer, smile, sweatdrop |
| 3 | 8 |  |  |  |  |  | 1girl, blush, brown_shirt, closed_mouth, kitauji_high_school_uniform, serafuku, solo, white_sailor_collar, indoors, smile, green_neckerchief, blurry_background, long_sleeves, looking_at_viewer |
| 4 | 6 |  |  |  |  |  | 1girl, blurry_background, blush, brown_shirt, brown_skirt, green_neckerchief, kitauji_high_school_uniform, long_sleeves, pleated_skirt, serafuku, smile, solo, standing, white_sailor_collar, closed_mouth, indoors |
| 5 | 5 |  |  |  |  |  | 1girl, blue_sailor_collar, blue_skirt, blush, closed_mouth, green_neckerchief, hand_up, indoors, kitauji_high_school_uniform, pleated_skirt, school_bag, serafuku, short_sleeves, solo, standing, white_shirt, window, black_bag, brown_hair, looking_at_viewer, mole_under_eye, smile, blurry_background, brown_eyes, hallway |
| 6 | 8 |  |  |  |  |  | 1girl, blue_sailor_collar, blurry_background, blush, kitauji_high_school_uniform, outdoors, rain, serafuku, solo, white_shirt, open_mouth, green_neckerchief, parted_lips, brown_eyes, building, wet_hair |
| 7 | 9 |  |  |  |  |  | 1girl, solo, green_jacket, blush, closed_mouth, blurry_background, looking_at_viewer, holding_instrument, long_sleeves, outdoors, shirt, track_jacket |
| 8 | 17 |  |  |  |  |  | 1girl, blush, yellow_headwear, solo, closed_mouth, holding_instrument, short_sleeves, band_uniform, orange_headwear, shirt, red_gloves, blurry_background, hat_feather, grey_background, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_sailor_collar | blush | green_neckerchief | kitauji_high_school_uniform | serafuku | white_shirt | indoors | solo | closed_mouth | looking_at_viewer | short_sleeves | chalkboard | holding_instrument | blue_skirt | chair | classroom | sitting | desk | pleated_skirt | brown_eyes | solo_focus | smile | blurry | instrument | standing | open_mouth | bag | sweatdrop | brown_shirt | white_sailor_collar | blurry_background | long_sleeves | brown_skirt | hand_up | school_bag | window | black_bag | brown_hair | mole_under_eye | hallway | outdoors | rain | parted_lips | building | wet_hair | green_jacket | shirt | track_jacket | yellow_headwear | band_uniform | orange_headwear | red_gloves | hat_feather | grey_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------|:--------|:--------------------|:------------------------------|:-----------|:--------------|:----------|:-------|:---------------|:--------------------|:----------------|:-------------|:---------------------|:-------------|:--------|:------------|:----------|:-------|:----------------|:-------------|:-------------|:--------|:---------|:-------------|:-----------|:-------------|:------|:------------|:--------------|:----------------------|:--------------------|:---------------|:--------------|:----------|:-------------|:---------|:------------|:-------------|:-----------------|:----------|:-----------|:-------|:--------------|:-----------|:-----------|:---------------|:--------|:---------------|:------------------|:---------------|:------------------|:-------------|:--------------|:------------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | X | | X | X | X | | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | X | X | X | X | | X | X | X | X | | | | | | | | | | | | X | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | | X | X | X | X | | X | X | X | | | | | | | | | | X | | | X | | | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | X | | | | | X | X | | X | | | X | | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | | | | | | | | | | | X | | | | | | X | | | | | X | | | | | | | | | | X | X | X | X | X | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | X | | X | | | | | | X | X | X | | | X | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | X | | | | | X | X | X | | | | | | |
| 8 | 17 |  |  |  |  |  | X | | X | | | | | | X | X | X | X | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X |
|
autoevaluate/autoeval-eval-phpthinh__examplei-all-929d48-1748861032 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- phpthinh/examplei
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-7b1
metrics: ['f1']
dataset_name: phpthinh/examplei
dataset_config: all
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-7b1
* Dataset: phpthinh/examplei
* Config: all
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@phpthinh](https://huggingface.co/phpthinh) for evaluating this model. |
malteos/wikinews-tmp2 | ---
dataset_info:
- config_name: de
features:
- name: language
dtype: string
- name: wiki_page_id
dtype: string
- name: wiki_revision_id
dtype: string
- name: revision_timestamp
dtype: timestamp[us, tz=UTC]
- name: revision_year
dtype: uint16
- name: revision_month
dtype: uint16
- name: article_timestamp
dtype: timestamp[us, tz=UTC]
- name: article_year
dtype: uint16
- name: article_month
dtype: uint16
- name: url
dtype: string
- name: title
dtype: string
- name: raw_text
dtype: string
- name: cleaned_text
dtype: string
- name: categories
sequence: string
- name: sources
sequence: string
- name: dump
dtype: string
splits:
- name: 2004_q4_12
num_bytes: 1060779
num_examples: 251
- name: 2005_q1_01
num_bytes: 402111
num_examples: 99
- name: 2005_q1_02
num_bytes: 602415
num_examples: 162
- name: 2005_q1_03
num_bytes: 845392
num_examples: 195
- name: 2005_q3_08
num_bytes: 1392393
num_examples: 360
- name: 2005_q2_04
num_bytes: 754328
num_examples: 186
- name: 2005_q2_05
num_bytes: 750409
num_examples: 179
- name: 2005_q3_07
num_bytes: 1380652
num_examples: 334
- name: 2005_q2_06
num_bytes: 993773
num_examples: 257
- name: 2005_q4_10
num_bytes: 1716394
num_examples: 410
- name: 2005_q4_11
num_bytes: 934471
num_examples: 230
- name: 2007_q1_03
num_bytes: 901035
num_examples: 175
- name: 2005_q3_09
num_bytes: 1659850
num_examples: 392
- name: 2004_q3_08
num_bytes: 7316
num_examples: 2
- name: 2005_q4_12
num_bytes: 1086986
num_examples: 268
- name: 2006_q1_01
num_bytes: 1209718
num_examples: 279
- name: 2006_q1_02
num_bytes: 819639
num_examples: 194
- name: 2006_q1_03
num_bytes: 1074845
num_examples: 247
- name: 2006_q2_06
num_bytes: 1170821
num_examples: 263
- name: 2006_q2_04
num_bytes: 978701
num_examples: 221
- name: 2006_q2_05
num_bytes: 1136732
num_examples: 271
- name: 2006_q3_07
num_bytes: 1161245
num_examples: 249
- name: 2006_q3_08
num_bytes: 1275797
num_examples: 241
- name: 2006_q3_09
num_bytes: 873844
num_examples: 157
- name: 2006_q4_10
num_bytes: 913674
num_examples: 206
- name: 2006_q4_11
num_bytes: 986117
num_examples: 193
- name: 2006_q4_12
num_bytes: 851848
num_examples: 183
- name: 2007_q1_02
num_bytes: 856040
num_examples: 163
- name: 2007_q1_01
num_bytes: 850607
num_examples: 181
- name: 2007_q2_06
num_bytes: 534063
num_examples: 111
- name: 2007_q2_04
num_bytes: 945588
num_examples: 160
- name: 2007_q2_05
num_bytes: 615775
num_examples: 124
- name: 2007_q3_07
num_bytes: 447023
num_examples: 111
- name: 2007_q3_08
num_bytes: 556296
num_examples: 125
- name: 2007_q3_09
num_bytes: 410399
num_examples: 89
- name: 2007_q4_10
num_bytes: 632163
num_examples: 110
- name: 2007_q4_11
num_bytes: 570752
num_examples: 105
- name: 2007_q4_12
num_bytes: 588606
num_examples: 128
- name: 2008_q1_01
num_bytes: 637010
num_examples: 109
- name: 2008_q1_02
num_bytes: 887579
num_examples: 170
- name: 2008_q1_03
num_bytes: 908286
num_examples: 143
- name: 2008_q2_04
num_bytes: 671330
num_examples: 110
- name: 2008_q2_05
num_bytes: 1051035
num_examples: 149
- name: 2008_q2_06
num_bytes: 795385
num_examples: 143
- name: 2008_q3_07
num_bytes: 439837
num_examples: 88
- name: 2008_q3_08
num_bytes: 596690
num_examples: 129
- name: 2008_q3_09
num_bytes: 641620
num_examples: 124
- name: 2008_q4_10
num_bytes: 553135
num_examples: 111
- name: 2008_q4_11
num_bytes: 526644
num_examples: 89
- name: 2008_q4_12
num_bytes: 573483
num_examples: 101
- name: 2009_q1_01
num_bytes: 677937
num_examples: 103
- name: 2009_q1_02
num_bytes: 655507
num_examples: 97
- name: 2009_q1_03
num_bytes: 487924
num_examples: 92
- name: 2009_q2_04
num_bytes: 208472
num_examples: 33
- name: 2009_q2_05
num_bytes: 407352
num_examples: 59
- name: 2009_q2_06
num_bytes: 294088
num_examples: 53
- name: 2009_q3_07
num_bytes: 254948
num_examples: 43
- name: 2009_q3_08
num_bytes: 156550
num_examples: 27
- name: 2009_q3_09
num_bytes: 340243
num_examples: 67
- name: 2009_q4_10
num_bytes: 545111
num_examples: 82
- name: 2009_q4_11
num_bytes: 231081
num_examples: 50
- name: 2009_q4_12
num_bytes: 381351
num_examples: 70
- name: 2010_q1_01
num_bytes: 559657
num_examples: 111
- name: 2010_q1_02
num_bytes: 673175
num_examples: 114
- name: 2010_q1_03
num_bytes: 578992
num_examples: 85
- name: 2010_q2_04
num_bytes: 535384
num_examples: 89
- name: 2010_q2_05
num_bytes: 419850
num_examples: 63
- name: 2010_q2_06
num_bytes: 500243
num_examples: 52
- name: 2010_q3_07
num_bytes: 245375
num_examples: 27
- name: 2010_q3_08
num_bytes: 248039
num_examples: 49
- name: 2010_q3_09
num_bytes: 141833
num_examples: 32
- name: 2010_q4_10
num_bytes: 687360
num_examples: 113
- name: 2010_q4_11
num_bytes: 606526
num_examples: 108
- name: 2010_q4_12
num_bytes: 752214
num_examples: 118
- name: 2011_q1_01
num_bytes: 643644
num_examples: 104
- name: 2011_q1_02
num_bytes: 647805
num_examples: 105
- name: 2011_q1_03
num_bytes: 753215
num_examples: 126
- name: 2011_q2_04
num_bytes: 787069
num_examples: 133
- name: 2011_q2_05
num_bytes: 689293
num_examples: 113
- name: 2011_q2_06
num_bytes: 492787
num_examples: 80
- name: 2011_q3_07
num_bytes: 359105
num_examples: 71
- name: 2011_q3_08
num_bytes: 324861
num_examples: 59
- name: 2011_q3_09
num_bytes: 385512
num_examples: 56
- name: 2011_q4_10
num_bytes: 404317
num_examples: 62
- name: 2011_q4_11
num_bytes: 413075
num_examples: 74
- name: 2011_q4_12
num_bytes: 373463
num_examples: 73
- name: 2012_q1_01
num_bytes: 326678
num_examples: 68
- name: 2012_q1_02
num_bytes: 398643
num_examples: 70
- name: 2012_q1_03
num_bytes: 630382
num_examples: 83
- name: 2012_q2_04
num_bytes: 292391
num_examples: 49
- name: 2012_q2_05
num_bytes: 227520
num_examples: 37
- name: 2012_q2_06
num_bytes: 274256
num_examples: 53
- name: 2012_q3_07
num_bytes: 483437
num_examples: 84
- name: 2012_q3_08
num_bytes: 182498
num_examples: 33
- name: 2012_q3_09
num_bytes: 226794
num_examples: 44
- name: 2012_q4_10
num_bytes: 180811
num_examples: 34
- name: 2012_q4_11
num_bytes: 198048
num_examples: 31
- name: 2012_q4_12
num_bytes: 167008
num_examples: 29
- name: 2013_q1_01
num_bytes: 210524
num_examples: 31
- name: no_date
num_bytes: 71272
num_examples: 91
- name: 2013_q1_02
num_bytes: 467978
num_examples: 62
- name: 2013_q1_03
num_bytes: 180397
num_examples: 26
- name: 2013_q2_04
num_bytes: 171961
num_examples: 25
- name: 2013_q2_05
num_bytes: 33195
num_examples: 8
- name: 2013_q2_06
num_bytes: 88946
num_examples: 18
- name: 2013_q3_07
num_bytes: 125554
num_examples: 18
- name: 2013_q3_09
num_bytes: 109708
num_examples: 15
- name: 2013_q3_08
num_bytes: 168271
num_examples: 22
- name: 2013_q4_10
num_bytes: 203074
num_examples: 24
- name: 2013_q4_11
num_bytes: 124006
num_examples: 16
- name: 2013_q4_12
num_bytes: 130484
num_examples: 17
- name: 2014_q1_01
num_bytes: 179227
num_examples: 21
- name: 2014_q1_02
num_bytes: 68916
num_examples: 9
- name: 2014_q1_03
num_bytes: 90720
num_examples: 10
- name: 2014_q2_04
num_bytes: 97483
num_examples: 11
- name: 2014_q2_05
num_bytes: 50559
num_examples: 7
- name: 2014_q2_06
num_bytes: 65725
num_examples: 12
- name: 2014_q3_07
num_bytes: 120677
num_examples: 17
- name: 2014_q3_08
num_bytes: 330280
num_examples: 44
- name: 2014_q3_09
num_bytes: 237154
num_examples: 33
- name: 2014_q4_10
num_bytes: 221648
num_examples: 31
- name: 2014_q4_11
num_bytes: 42332
num_examples: 8
- name: 2014_q4_12
num_bytes: 102919
num_examples: 21
- name: 2015_q1_01
num_bytes: 168398
num_examples: 32
- name: 2015_q1_02
num_bytes: 103296
num_examples: 16
- name: 2015_q1_03
num_bytes: 97980
num_examples: 20
- name: 2015_q2_04
num_bytes: 116038
num_examples: 25
- name: 2015_q2_06
num_bytes: 45288
num_examples: 8
- name: 2015_q2_05
num_bytes: 128943
num_examples: 15
- name: 2015_q3_07
num_bytes: 97194
num_examples: 19
- name: 2015_q3_08
num_bytes: 36542
num_examples: 6
- name: 2015_q3_09
num_bytes: 42346
num_examples: 10
- name: 2015_q4_10
num_bytes: 8216
num_examples: 3
- name: 2015_q4_11
num_bytes: 46792
num_examples: 10
- name: 2015_q4_12
num_bytes: 62521
num_examples: 14
- name: 2016_q1_02
num_bytes: 68608
num_examples: 15
- name: 2016_q1_01
num_bytes: 104689
num_examples: 20
- name: 2016_q1_03
num_bytes: 118623
num_examples: 23
- name: 2016_q2_05
num_bytes: 54263
num_examples: 12
- name: 2016_q2_04
num_bytes: 75443
num_examples: 12
- name: 2016_q2_06
num_bytes: 79838
num_examples: 17
- name: 2016_q3_07
num_bytes: 86013
num_examples: 15
- name: 2016_q3_08
num_bytes: 116878
num_examples: 21
- name: 2016_q3_09
num_bytes: 154236
num_examples: 30
- name: 2016_q4_10
num_bytes: 53790
num_examples: 12
- name: 2016_q4_12
num_bytes: 112099
num_examples: 20
- name: 2016_q4_11
num_bytes: 59706
num_examples: 12
- name: 2017_q1_01
num_bytes: 122131
num_examples: 28
- name: 2017_q1_02
num_bytes: 79897
num_examples: 16
- name: 2017_q1_03
num_bytes: 227358
num_examples: 41
- name: 2017_q2_04
num_bytes: 266740
num_examples: 48
- name: 2017_q2_05
num_bytes: 152658
num_examples: 27
- name: 2017_q2_06
num_bytes: 142876
num_examples: 24
- name: 2017_q3_07
num_bytes: 235705
num_examples: 41
- name: 2017_q3_08
num_bytes: 64080
num_examples: 13
- name: 2017_q3_09
num_bytes: 184925
num_examples: 31
- name: 2017_q4_11
num_bytes: 118811
num_examples: 25
- name: 2017_q4_10
num_bytes: 148118
num_examples: 27
- name: 2017_q4_12
num_bytes: 130539
num_examples: 25
- name: 2018_q1_01
num_bytes: 90900
num_examples: 19
- name: 2018_q1_02
num_bytes: 52841
num_examples: 11
- name: 2018_q1_03
num_bytes: 51113
num_examples: 8
- name: 2018_q2_04
num_bytes: 53700
num_examples: 12
- name: 2018_q2_05
num_bytes: 70419
num_examples: 11
- name: 2018_q2_06
num_bytes: 59884
num_examples: 11
- name: 2018_q3_07
num_bytes: 39363
num_examples: 7
- name: 2018_q3_08
num_bytes: 45066
num_examples: 10
- name: 2018_q3_09
num_bytes: 47807
num_examples: 6
- name: 2018_q4_10
num_bytes: 36896
num_examples: 9
- name: 2018_q4_11
num_bytes: 17234
num_examples: 5
- name: 2018_q4_12
num_bytes: 18698
num_examples: 4
- name: 2019_q1_01
num_bytes: 13005
num_examples: 3
- name: 2019_q1_02
num_bytes: 24149
num_examples: 5
- name: 2019_q1_03
num_bytes: 27307
num_examples: 6
- name: 2019_q2_04
num_bytes: 61434
num_examples: 6
- name: 2019_q2_05
num_bytes: 39358
num_examples: 8
- name: 2019_q2_06
num_bytes: 233591
num_examples: 27
- name: 2019_q3_07
num_bytes: 350304
num_examples: 38
- name: 2019_q3_08
num_bytes: 286313
num_examples: 33
- name: 2019_q3_09
num_bytes: 236229
num_examples: 29
- name: 2019_q4_10
num_bytes: 38740
num_examples: 4
- name: 2019_q4_11
num_bytes: 22457
num_examples: 5
- name: 2019_q4_12
num_bytes: 7591
num_examples: 1
- name: 2020_q1_01
num_bytes: 95540
num_examples: 21
- name: 2020_q1_02
num_bytes: 8725
num_examples: 2
- name: 2021_q1_01
num_bytes: 102453
num_examples: 12
- name: 2020_q1_03
num_bytes: 9171
num_examples: 1
- name: 2020_q2_04
num_bytes: 16929
num_examples: 3
- name: 2020_q2_05
num_bytes: 10720
num_examples: 4
- name: 2020_q2_06
num_bytes: 8845
num_examples: 1
- name: 2020_q3_07
num_bytes: 5522
num_examples: 1
- name: 2020_q3_08
num_bytes: 15056
num_examples: 2
- name: 2020_q4_10
num_bytes: 9631
num_examples: 2
- name: 2020_q4_11
num_bytes: 26447
num_examples: 4
- name: 2021_q2_06
num_bytes: 22197
num_examples: 3
- name: 2021_q1_02
num_bytes: 9622
num_examples: 2
- name: 2021_q1_03
num_bytes: 76395
num_examples: 10
- name: 2021_q2_04
num_bytes: 10197
num_examples: 2
- name: 2021_q2_05
num_bytes: 28178
num_examples: 3
- name: 2021_q3_08
num_bytes: 2544
num_examples: 1
- name: 2021_q3_09
num_bytes: 20006
num_examples: 3
- name: 2021_q4_10
num_bytes: 23564
num_examples: 2
- name: 2021_q4_11
num_bytes: 6302
num_examples: 1
- name: 2021_q4_12
num_bytes: 22592
num_examples: 4
- name: 2022_q1_01
num_bytes: 43817
num_examples: 9
- name: 2022_q1_02
num_bytes: 52804
num_examples: 7
- name: 2022_q1_03
num_bytes: 18215
num_examples: 1
- name: 2022_q2_04
num_bytes: 22646
num_examples: 5
- name: 2022_q2_05
num_bytes: 36351
num_examples: 4
- name: 2022_q2_06
num_bytes: 17348
num_examples: 3
- name: 2022_q3_07
num_bytes: 22670
num_examples: 3
- name: 2022_q3_08
num_bytes: 13727
num_examples: 2
- name: 2022_q3_09
num_bytes: 22554
num_examples: 3
- name: 2022_q4_10
num_bytes: 38663
num_examples: 7
- name: 2022_q4_11
num_bytes: 64207
num_examples: 11
- name: 2022_q4_12
num_bytes: 23709
num_examples: 3
- name: 2023_q1_01
num_bytes: 14467
num_examples: 3
- name: 2023_q1_02
num_bytes: 33866
num_examples: 6
- name: 2023_q1_03
num_bytes: 30961
num_examples: 3
- name: 2023_q2_05
num_bytes: 19571
num_examples: 2
- name: 2023_q2_06
num_bytes: 14605
num_examples: 3
- name: 2023_q3_07
num_bytes: 22299
num_examples: 3
- name: 2023_q3_09
num_bytes: 9213
num_examples: 2
- name: 2023_q4_10
num_bytes: 5374
num_examples: 1
- name: 2023_q4_11
num_bytes: 52678
num_examples: 8
- name: 2023_q4_12
num_bytes: 47428
num_examples: 6
- name: 2024_q1_01
num_bytes: 140960
num_examples: 15
download_size: 47849068
dataset_size: 73510211
- config_name: en
features:
- name: language
dtype: string
- name: wiki_page_id
dtype: string
- name: wiki_revision_id
dtype: string
- name: revision_timestamp
dtype: timestamp[us, tz=UTC]
- name: revision_year
dtype: uint16
- name: revision_month
dtype: uint16
- name: article_timestamp
dtype: timestamp[us, tz=UTC]
- name: article_year
dtype: uint16
- name: article_month
dtype: uint16
- name: url
dtype: string
- name: title
dtype: string
- name: raw_text
dtype: string
- name: cleaned_text
dtype: string
- name: categories
sequence: string
- name: sources
sequence: string
- name: dump
dtype: string
splits:
- name: 2004_q4_11
num_bytes: 355384
num_examples: 73
- name: no_date
num_bytes: 905609
num_examples: 770
- name: 2004_q4_12
num_bytes: 512436
num_examples: 97
- name: 2005_q1_01
num_bytes: 923286
num_examples: 220
- name: 2007_q1_01
num_bytes: 1342223
num_examples: 270
- name: 2005_q1_02
num_bytes: 1117537
num_examples: 244
- name: 2005_q2_04
num_bytes: 2350702
num_examples: 422
- name: 2015_q3_08
num_bytes: 295824
num_examples: 41
- name: 2005_q1_03
num_bytes: 1883036
num_examples: 363
- name: 2024_q1_03
num_bytes: 4108
num_examples: 1
- name: 2024_q2_04
num_bytes: 8733
num_examples: 1
- name: 2005_q2_05
num_bytes: 1686948
num_examples: 288
- name: 2005_q3_09
num_bytes: 1807812
num_examples: 357
- name: 2005_q2_06
num_bytes: 1225746
num_examples: 248
- name: 2005_q3_07
num_bytes: 1840766
num_examples: 397
- name: 2005_q3_08
num_bytes: 1308424
num_examples: 270
- name: 2006_q4_12
num_bytes: 1138354
num_examples: 210
- name: 2005_q4_10
num_bytes: 1251164
num_examples: 256
- name: 2005_q4_11
num_bytes: 1126641
num_examples: 204
- name: 2005_q4_12
num_bytes: 1322668
num_examples: 278
- name: 2006_q1_01
num_bytes: 1918408
num_examples: 351
- name: 2006_q1_03
num_bytes: 1646980
num_examples: 316
- name: 2006_q1_02
num_bytes: 1447222
num_examples: 251
- name: 2009_q1_03
num_bytes: 1463180
num_examples: 237
- name: 2006_q2_04
num_bytes: 1198789
num_examples: 229
- name: 2006_q2_05
num_bytes: 1919629
num_examples: 338
- name: 2006_q2_06
num_bytes: 2131064
num_examples: 384
- name: 2006_q3_07
num_bytes: 1467690
num_examples: 263
- name: 2006_q3_08
num_bytes: 1773696
num_examples: 361
- name: 2006_q4_10
num_bytes: 1559210
num_examples: 278
- name: 2006_q3_09
num_bytes: 1369162
num_examples: 287
- name: 2006_q4_11
num_bytes: 2330043
num_examples: 355
- name: 2007_q1_02
num_bytes: 1064677
num_examples: 213
- name: 2007_q1_03
num_bytes: 1808225
num_examples: 306
- name: 2007_q2_05
num_bytes: 1616371
num_examples: 286
- name: 2007_q2_04
num_bytes: 1717390
num_examples: 292
- name: 2007_q3_08
num_bytes: 1550665
num_examples: 308
- name: 2011_q4_10
num_bytes: 211069
num_examples: 37
- name: 2008_q2_06
num_bytes: 1559641
num_examples: 276
- name: 2009_q4_11
num_bytes: 1478810
num_examples: 256
- name: 2007_q4_10
num_bytes: 3000908
num_examples: 351
- name: 2007_q2_06
num_bytes: 1746377
num_examples: 292
- name: 2007_q4_11
num_bytes: 1741085
num_examples: 245
- name: 2007_q3_07
num_bytes: 1727844
num_examples: 319
- name: 2007_q3_09
num_bytes: 2605555
num_examples: 387
- name: 2008_q1_01
num_bytes: 1521809
num_examples: 245
- name: 2007_q4_12
num_bytes: 1611466
num_examples: 230
- name: 2009_q1_01
num_bytes: 1566535
num_examples: 188
- name: 2008_q1_02
num_bytes: 2336357
num_examples: 304
- name: 2008_q1_03
num_bytes: 2306340
num_examples: 287
- name: 2008_q2_04
num_bytes: 1614034
num_examples: 243
- name: 2008_q3_08
num_bytes: 1230646
num_examples: 199
- name: 2008_q2_05
num_bytes: 1652925
num_examples: 251
- name: 2009_q3_08
num_bytes: 936595
num_examples: 169
- name: 2008_q3_07
num_bytes: 1294765
num_examples: 228
- name: 2008_q3_09
num_bytes: 1599293
num_examples: 226
- name: 2009_q3_09
num_bytes: 1006084
num_examples: 188
- name: 2009_q3_07
num_bytes: 761938
num_examples: 123
- name: 2008_q4_10
num_bytes: 1196006
num_examples: 166
- name: 2008_q4_11
num_bytes: 910754
num_examples: 140
- name: 2008_q4_12
num_bytes: 1086771
num_examples: 152
- name: 2009_q1_02
num_bytes: 1325042
num_examples: 163
- name: 2009_q2_05
num_bytes: 903789
num_examples: 128
- name: 2009_q2_04
num_bytes: 1120911
num_examples: 168
- name: 2009_q2_06
num_bytes: 1060240
num_examples: 177
- name: 2009_q4_10
num_bytes: 1399598
num_examples: 250
- name: 2009_q4_12
num_bytes: 1069161
num_examples: 191
- name: 2010_q1_01
num_bytes: 1738252
num_examples: 284
- name: 2010_q2_04
num_bytes: 1046709
num_examples: 178
- name: 2010_q1_02
num_bytes: 871529
num_examples: 135
- name: 2010_q1_03
num_bytes: 962769
num_examples: 179
- name: 2010_q4_12
num_bytes: 497413
num_examples: 83
- name: 2010_q2_05
num_bytes: 1261795
num_examples: 204
- name: 2010_q3_09
num_bytes: 726326
num_examples: 123
- name: 2010_q2_06
num_bytes: 786041
num_examples: 145
- name: 2010_q3_07
num_bytes: 834256
num_examples: 167
- name: 2010_q3_08
num_bytes: 808810
num_examples: 148
- name: 2010_q4_10
num_bytes: 678136
num_examples: 105
- name: 2010_q4_11
num_bytes: 504834
num_examples: 91
- name: 2011_q4_12
num_bytes: 408203
num_examples: 51
- name: 2011_q1_01
num_bytes: 322722
num_examples: 58
- name: 2011_q1_02
num_bytes: 313174
num_examples: 55
- name: 2011_q1_03
num_bytes: 854825
num_examples: 148
- name: 2011_q2_04
num_bytes: 759125
num_examples: 117
- name: 2011_q2_05
num_bytes: 482716
num_examples: 73
- name: 2011_q2_06
num_bytes: 587303
num_examples: 80
- name: 2011_q3_07
num_bytes: 409714
num_examples: 62
- name: 2011_q3_08
num_bytes: 529718
num_examples: 87
- name: 2011_q3_09
num_bytes: 190011
num_examples: 33
- name: 2011_q4_11
num_bytes: 238284
num_examples: 36
- name: 2012_q1_01
num_bytes: 471820
num_examples: 61
- name: 2012_q2_05
num_bytes: 489742
num_examples: 51
- name: 2012_q1_02
num_bytes: 380246
num_examples: 50
- name: 2012_q1_03
num_bytes: 543120
num_examples: 79
- name: 2012_q2_04
num_bytes: 504965
num_examples: 67
- name: 2012_q2_06
num_bytes: 361222
num_examples: 41
- name: 2012_q3_07
num_bytes: 254343
num_examples: 36
- name: 2012_q3_08
num_bytes: 309427
num_examples: 60
- name: 2012_q3_09
num_bytes: 525534
num_examples: 68
- name: 2012_q4_10
num_bytes: 203830
num_examples: 26
- name: 2012_q4_11
num_bytes: 266817
num_examples: 38
- name: 2012_q4_12
num_bytes: 414347
num_examples: 49
- name: 2013_q1_02
num_bytes: 411979
num_examples: 53
- name: 2013_q1_01
num_bytes: 275627
num_examples: 37
- name: 2013_q1_03
num_bytes: 273876
num_examples: 39
- name: 2013_q2_04
num_bytes: 251845
num_examples: 38
- name: 2013_q2_05
num_bytes: 130590
num_examples: 26
- name: 2013_q2_06
num_bytes: 171750
num_examples: 27
- name: 2013_q3_07
num_bytes: 188012
num_examples: 29
- name: 2013_q3_08
num_bytes: 299340
num_examples: 51
- name: 2013_q3_09
num_bytes: 323758
num_examples: 35
- name: 2013_q4_10
num_bytes: 262738
num_examples: 27
- name: 2013_q4_11
num_bytes: 168842
num_examples: 23
- name: 2013_q4_12
num_bytes: 195190
num_examples: 29
- name: 2014_q1_01
num_bytes: 218250
num_examples: 20
- name: 2014_q1_02
num_bytes: 178689
num_examples: 26
- name: 2014_q1_03
num_bytes: 451829
num_examples: 44
- name: 2014_q2_04
num_bytes: 198848
num_examples: 28
- name: 2014_q2_05
num_bytes: 164722
num_examples: 29
- name: 2014_q2_06
num_bytes: 197918
num_examples: 30
- name: 2014_q3_07
num_bytes: 201585
num_examples: 26
- name: 2014_q3_08
num_bytes: 262071
num_examples: 40
- name: 2014_q3_09
num_bytes: 143604
num_examples: 27
- name: 2014_q4_11
num_bytes: 139744
num_examples: 15
- name: 2014_q4_10
num_bytes: 82130
num_examples: 15
- name: 2014_q4_12
num_bytes: 170632
num_examples: 23
- name: 2015_q1_01
num_bytes: 152919
num_examples: 24
- name: 2015_q1_02
num_bytes: 121485
num_examples: 23
- name: 2015_q1_03
num_bytes: 207556
num_examples: 41
- name: 2015_q2_04
num_bytes: 111761
num_examples: 23
- name: 2015_q2_05
num_bytes: 207422
num_examples: 35
- name: 2015_q2_06
num_bytes: 106895
num_examples: 16
- name: 2015_q3_07
num_bytes: 61401
num_examples: 12
- name: 2015_q3_09
num_bytes: 156692
num_examples: 28
- name: 2015_q4_10
num_bytes: 208696
num_examples: 30
- name: 2015_q4_11
num_bytes: 110322
num_examples: 21
- name: 2015_q4_12
num_bytes: 155700
num_examples: 22
- name: 2016_q1_01
num_bytes: 72442
num_examples: 16
- name: 2016_q1_02
num_bytes: 75316
num_examples: 16
- name: 2016_q1_03
num_bytes: 144236
num_examples: 12
- name: 2016_q2_04
num_bytes: 53875
num_examples: 11
- name: 2016_q2_05
num_bytes: 183934
num_examples: 37
- name: 2016_q2_06
num_bytes: 318907
num_examples: 45
- name: 2016_q3_07
num_bytes: 280617
num_examples: 41
- name: 2016_q3_08
num_bytes: 157755
num_examples: 21
- name: 2016_q3_09
num_bytes: 186193
num_examples: 17
- name: 2016_q4_10
num_bytes: 165777
num_examples: 19
- name: 2016_q4_11
num_bytes: 270544
num_examples: 20
- name: 2016_q4_12
num_bytes: 267447
num_examples: 44
- name: 2017_q1_01
num_bytes: 329229
num_examples: 50
- name: 2017_q1_02
num_bytes: 129565
num_examples: 17
- name: 2017_q1_03
num_bytes: 114917
num_examples: 20
- name: 2017_q2_04
num_bytes: 97982
num_examples: 16
- name: 2017_q2_05
num_bytes: 205312
num_examples: 25
- name: 2018_q1_01
num_bytes: 192212
num_examples: 27
- name: 2017_q2_06
num_bytes: 176035
num_examples: 26
- name: 2017_q3_07
num_bytes: 128542
num_examples: 20
- name: 2017_q3_08
num_bytes: 127170
num_examples: 16
- name: 2017_q4_10
num_bytes: 194174
num_examples: 27
- name: 2017_q3_09
num_bytes: 148202
num_examples: 23
- name: 2017_q4_11
num_bytes: 108905
num_examples: 18
- name: 2017_q4_12
num_bytes: 135560
num_examples: 16
- name: 2018_q1_02
num_bytes: 155993
num_examples: 24
- name: 2018_q1_03
num_bytes: 178247
num_examples: 23
- name: 2018_q2_04
num_bytes: 112064
num_examples: 14
- name: 2018_q2_05
num_bytes: 281623
num_examples: 29
- name: 2018_q2_06
num_bytes: 403538
num_examples: 32
- name: 2018_q3_07
num_bytes: 313388
num_examples: 38
- name: 2018_q3_08
num_bytes: 61281
num_examples: 10
- name: 2018_q3_09
num_bytes: 50511
num_examples: 6
- name: 2018_q4_10
num_bytes: 88082
num_examples: 11
- name: 2019_q1_01
num_bytes: 198352
num_examples: 21
- name: 2018_q4_11
num_bytes: 142523
num_examples: 12
- name: 2018_q4_12
num_bytes: 94151
num_examples: 13
- name: 2019_q1_02
num_bytes: 98752
num_examples: 17
- name: 2019_q1_03
num_bytes: 70330
num_examples: 8
- name: 2019_q2_04
num_bytes: 31683
num_examples: 6
- name: 2019_q2_05
num_bytes: 86466
num_examples: 10
- name: 2019_q2_06
num_bytes: 28172
num_examples: 5
- name: 2019_q3_07
num_bytes: 63008
num_examples: 11
- name: 2019_q3_08
num_bytes: 85796
num_examples: 13
- name: 2019_q3_09
num_bytes: 240696
num_examples: 32
- name: 2019_q4_10
num_bytes: 195100
num_examples: 31
- name: 2019_q4_11
num_bytes: 65136
num_examples: 9
- name: 2019_q4_12
num_bytes: 56507
num_examples: 8
- name: 2020_q1_01
num_bytes: 109732
num_examples: 15
- name: 2020_q1_02
num_bytes: 71828
num_examples: 10
- name: 2020_q1_03
num_bytes: 152494
num_examples: 17
- name: 2020_q2_04
num_bytes: 212284
num_examples: 10
- name: 2020_q3_08
num_bytes: 93948
num_examples: 8
- name: 2020_q2_05
num_bytes: 73226
num_examples: 7
- name: 2020_q2_06
num_bytes: 252008
num_examples: 13
- name: 2020_q3_07
num_bytes: 140296
num_examples: 6
- name: 2020_q3_09
num_bytes: 91403
num_examples: 7
- name: 2020_q4_10
num_bytes: 275977
num_examples: 11
- name: 2020_q4_12
num_bytes: 173083
num_examples: 7
- name: 2020_q4_11
num_bytes: 154144
num_examples: 8
- name: 2021_q2_04
num_bytes: 466728
num_examples: 19
- name: 2021_q1_01
num_bytes: 111514
num_examples: 6
- name: 2021_q1_02
num_bytes: 96375
num_examples: 15
- name: 2021_q1_03
num_bytes: 106729
num_examples: 14
- name: 2021_q2_05
num_bytes: 143611
num_examples: 10
- name: 2021_q2_06
num_bytes: 260818
num_examples: 11
- name: 2021_q3_07
num_bytes: 264632
num_examples: 26
- name: 2021_q3_08
num_bytes: 134875
num_examples: 15
- name: 2021_q3_09
num_bytes: 160030
num_examples: 11
- name: 2021_q4_10
num_bytes: 128165
num_examples: 13
- name: 2021_q4_11
num_bytes: 80221
num_examples: 11
- name: 2022_q1_02
num_bytes: 154569
num_examples: 22
- name: 2021_q4_12
num_bytes: 26156
num_examples: 5
- name: 2022_q1_01
num_bytes: 80374
num_examples: 9
- name: 2022_q1_03
num_bytes: 225354
num_examples: 30
- name: 2022_q2_04
num_bytes: 104946
num_examples: 19
- name: 2022_q2_05
num_bytes: 97917
num_examples: 11
- name: 2022_q2_06
num_bytes: 75563
num_examples: 10
- name: 2022_q3_07
num_bytes: 46100
num_examples: 7
- name: 2022_q3_08
num_bytes: 29657
num_examples: 5
- name: 2022_q3_09
num_bytes: 118604
num_examples: 16
- name: 2022_q4_11
num_bytes: 9809
num_examples: 2
- name: 2022_q4_10
num_bytes: 7127
num_examples: 1
- name: 2022_q4_12
num_bytes: 14544
num_examples: 2
- name: 2023_q1_01
num_bytes: 106175
num_examples: 13
- name: 2023_q1_02
num_bytes: 203995
num_examples: 25
- name: 2023_q1_03
num_bytes: 189811
num_examples: 16
- name: 2023_q2_04
num_bytes: 247611
num_examples: 29
- name: 2023_q2_05
num_bytes: 189881
num_examples: 24
- name: 2023_q2_06
num_bytes: 63767
num_examples: 10
- name: 2023_q3_07
num_bytes: 98644
num_examples: 13
- name: 2023_q3_08
num_bytes: 112643
num_examples: 10
- name: 2023_q3_09
num_bytes: 51015
num_examples: 5
- name: 2023_q4_10
num_bytes: 18434
num_examples: 1
- name: 2023_q4_12
num_bytes: 18031
num_examples: 2
- name: 2023_q4_11
num_bytes: 4991
num_examples: 1
- name: 2024_q1_01
num_bytes: 116096
num_examples: 16
download_size: 82011071
dataset_size: 135006828
- config_name: es
features:
- name: language
dtype: string
- name: wiki_page_id
dtype: string
- name: wiki_revision_id
dtype: string
- name: revision_timestamp
dtype: timestamp[us, tz=UTC]
- name: revision_year
dtype: uint16
- name: revision_month
dtype: uint16
- name: article_timestamp
dtype: timestamp[us, tz=UTC]
- name: article_year
dtype: uint16
- name: article_month
dtype: uint16
- name: url
dtype: string
- name: title
dtype: string
- name: raw_text
dtype: string
- name: cleaned_text
dtype: string
- name: categories
sequence: string
- name: sources
sequence: string
- name: dump
dtype: string
splits:
- name: 2005_q1_01
num_bytes: 22834
num_examples: 3
- name: 2005_q1_02
num_bytes: 233476
num_examples: 50
- name: 2004_q1_02
num_bytes: 6301
num_examples: 1
- name: 2005_q1_03
num_bytes: 339366
num_examples: 52
- name: no_date
num_bytes: 2274435
num_examples: 2377
- name: 2005_q2_04
num_bytes: 453620
num_examples: 65
- name: 2005_q2_05
num_bytes: 435347
num_examples: 75
- name: 2005_q2_06
num_bytes: 652616
num_examples: 95
- name: 2005_q3_07
num_bytes: 472426
num_examples: 85
- name: 2005_q3_08
num_bytes: 543954
num_examples: 83
- name: 2005_q3_09
num_bytes: 591913
num_examples: 110
- name: 2005_q4_10
num_bytes: 672556
num_examples: 133
- name: 2005_q4_12
num_bytes: 691112
num_examples: 134
- name: 2006_q4_10
num_bytes: 197043
num_examples: 30
- name: 2005_q4_11
num_bytes: 504460
num_examples: 91
- name: 2006_q1_01
num_bytes: 659083
num_examples: 102
- name: 2006_q1_02
num_bytes: 282153
num_examples: 50
- name: 2006_q1_03
num_bytes: 464141
num_examples: 64
- name: 2006_q2_04
num_bytes: 191351
num_examples: 39
- name: 2006_q2_05
num_bytes: 191681
num_examples: 32
- name: 2006_q2_06
num_bytes: 258078
num_examples: 37
- name: 2006_q3_07
num_bytes: 222353
num_examples: 38
- name: 2006_q3_08
num_bytes: 360101
num_examples: 61
- name: 2006_q3_09
num_bytes: 312815
num_examples: 53
- name: 2006_q4_11
num_bytes: 294913
num_examples: 50
- name: 2006_q4_12
num_bytes: 134637
num_examples: 17
- name: 2007_q1_01
num_bytes: 157406
num_examples: 32
- name: 2007_q1_02
num_bytes: 149871
num_examples: 34
- name: 2007_q1_03
num_bytes: 137884
num_examples: 26
- name: 2007_q2_04
num_bytes: 105702
num_examples: 24
- name: 2007_q2_05
num_bytes: 91214
num_examples: 18
- name: 2007_q2_06
num_bytes: 218646
num_examples: 40
- name: 2007_q3_07
num_bytes: 622054
num_examples: 117
- name: 2007_q3_08
num_bytes: 905966
num_examples: 182
- name: 2007_q3_09
num_bytes: 930518
num_examples: 194
- name: 2007_q4_10
num_bytes: 870495
num_examples: 176
- name: 2007_q4_11
num_bytes: 764502
num_examples: 150
- name: 2007_q4_12
num_bytes: 361356
num_examples: 88
- name: 2008_q1_01
num_bytes: 401702
num_examples: 93
- name: 2008_q1_02
num_bytes: 273081
num_examples: 63
- name: 2008_q1_03
num_bytes: 326261
num_examples: 66
- name: 2008_q2_04
num_bytes: 192046
num_examples: 42
- name: 2008_q2_05
num_bytes: 95568
num_examples: 23
- name: 2008_q2_06
num_bytes: 110130
num_examples: 22
- name: 2008_q3_07
num_bytes: 254800
num_examples: 57
- name: 2008_q3_08
num_bytes: 247018
num_examples: 53
- name: 2008_q3_09
num_bytes: 633283
num_examples: 121
- name: 2008_q4_10
num_bytes: 1436212
num_examples: 263
- name: 2008_q4_11
num_bytes: 1144816
num_examples: 223
- name: 2008_q4_12
num_bytes: 414501
num_examples: 83
- name: 2009_q1_01
num_bytes: 286225
num_examples: 53
- name: 2009_q1_02
num_bytes: 99781
num_examples: 23
- name: 2009_q1_03
num_bytes: 324388
num_examples: 64
- name: 2009_q2_04
num_bytes: 211490
num_examples: 40
- name: 2009_q2_05
num_bytes: 542402
num_examples: 88
- name: 2009_q2_06
num_bytes: 672529
num_examples: 138
- name: 2009_q3_07
num_bytes: 344957
num_examples: 71
- name: 2009_q3_08
num_bytes: 436349
num_examples: 91
- name: 2009_q3_09
num_bytes: 413367
num_examples: 91
- name: 2009_q4_10
num_bytes: 872489
num_examples: 177
- name: 2009_q4_11
num_bytes: 531836
num_examples: 107
- name: 2009_q4_12
num_bytes: 319925
num_examples: 66
- name: 2010_q1_01
num_bytes: 403994
num_examples: 86
- name: 2010_q1_02
num_bytes: 544627
num_examples: 107
- name: 2010_q1_03
num_bytes: 383184
num_examples: 82
- name: 2011_q1_02
num_bytes: 432905
num_examples: 92
- name: 2010_q2_04
num_bytes: 303778
num_examples: 60
- name: 2010_q2_05
num_bytes: 370187
num_examples: 78
- name: 2010_q2_06
num_bytes: 656462
num_examples: 125
- name: 2010_q3_07
num_bytes: 315247
num_examples: 59
- name: 2010_q3_08
num_bytes: 271941
num_examples: 61
- name: 2010_q3_09
num_bytes: 312488
num_examples: 65
- name: 2010_q4_10
num_bytes: 393973
num_examples: 69
- name: 2010_q4_11
num_bytes: 403671
num_examples: 79
- name: 2010_q4_12
num_bytes: 531468
num_examples: 101
- name: 2011_q1_01
num_bytes: 444023
num_examples: 91
- name: 2013_q2_04
num_bytes: 691382
num_examples: 124
- name: 2011_q1_03
num_bytes: 473642
num_examples: 108
- name: 2011_q2_04
num_bytes: 356268
num_examples: 76
- name: 2011_q2_05
num_bytes: 351852
num_examples: 73
- name: 2011_q2_06
num_bytes: 186976
num_examples: 41
- name: 2011_q3_07
num_bytes: 461515
num_examples: 89
- name: 2011_q3_08
num_bytes: 295017
num_examples: 65
- name: 2011_q3_09
num_bytes: 412812
num_examples: 81
- name: 2011_q4_10
num_bytes: 740301
num_examples: 137
- name: 2011_q4_11
num_bytes: 597173
num_examples: 104
- name: 2011_q4_12
num_bytes: 586942
num_examples: 107
- name: 2012_q1_01
num_bytes: 711074
num_examples: 110
- name: 2012_q1_02
num_bytes: 320729
num_examples: 62
- name: 2012_q1_03
num_bytes: 232611
num_examples: 44
- name: 2012_q2_04
num_bytes: 256016
num_examples: 43
- name: 2012_q2_05
num_bytes: 301007
num_examples: 55
- name: 2012_q2_06
num_bytes: 233785
num_examples: 45
- name: 2012_q3_07
num_bytes: 232577
num_examples: 46
- name: 2012_q3_08
num_bytes: 248766
num_examples: 44
- name: 2012_q3_09
num_bytes: 164488
num_examples: 30
- name: 2012_q4_10
num_bytes: 218492
num_examples: 49
- name: 2012_q4_11
num_bytes: 197981
num_examples: 33
- name: 2012_q4_12
num_bytes: 201324
num_examples: 43
- name: 2013_q1_01
num_bytes: 248580
num_examples: 46
- name: 2013_q1_02
num_bytes: 344429
num_examples: 68
- name: 2013_q1_03
num_bytes: 303547
num_examples: 72
- name: 2013_q2_05
num_bytes: 497282
num_examples: 96
- name: 2013_q2_06
num_bytes: 294098
num_examples: 58
- name: 2013_q3_07
num_bytes: 394460
num_examples: 83
- name: 2013_q3_08
num_bytes: 203708
num_examples: 45
- name: 2013_q3_09
num_bytes: 180442
num_examples: 30
- name: 2013_q4_11
num_bytes: 123514
num_examples: 20
- name: 2013_q4_10
num_bytes: 190747
num_examples: 37
- name: 2013_q4_12
num_bytes: 158416
num_examples: 34
- name: 2014_q1_01
num_bytes: 239960
num_examples: 55
- name: 2014_q1_02
num_bytes: 188424
num_examples: 36
- name: 2024_q1_02
num_bytes: 4466
num_examples: 1
- name: 2014_q1_03
num_bytes: 123682
num_examples: 29
- name: 2014_q2_04
num_bytes: 212336
num_examples: 32
- name: 2014_q2_05
num_bytes: 336997
num_examples: 48
- name: 2014_q2_06
num_bytes: 493155
num_examples: 83
- name: 2014_q3_07
num_bytes: 395369
num_examples: 63
- name: 2014_q3_08
num_bytes: 171711
num_examples: 37
- name: 2014_q3_09
num_bytes: 806133
num_examples: 198
- name: 2014_q4_10
num_bytes: 743446
num_examples: 165
- name: 2014_q4_11
num_bytes: 629966
num_examples: 129
- name: 2014_q4_12
num_bytes: 386373
num_examples: 72
- name: 2015_q1_01
num_bytes: 176652
num_examples: 42
- name: 2015_q1_02
num_bytes: 108544
num_examples: 21
- name: 2015_q1_03
num_bytes: 78889
num_examples: 14
- name: 2015_q2_04
num_bytes: 47374
num_examples: 9
- name: 2015_q2_05
num_bytes: 200731
num_examples: 32
- name: 2015_q2_06
num_bytes: 251791
num_examples: 42
- name: 2015_q3_07
num_bytes: 225308
num_examples: 39
- name: 2015_q3_08
num_bytes: 30863
num_examples: 7
- name: 2015_q3_09
num_bytes: 41992
num_examples: 10
- name: 2015_q4_10
num_bytes: 118626
num_examples: 20
- name: 2015_q4_11
num_bytes: 48667
num_examples: 9
- name: 2015_q4_12
num_bytes: 35187
num_examples: 7
- name: 2016_q1_01
num_bytes: 149348
num_examples: 18
- name: 2016_q1_02
num_bytes: 192286
num_examples: 32
- name: 2016_q1_03
num_bytes: 109103
num_examples: 27
- name: 2016_q2_04
num_bytes: 187748
num_examples: 50
- name: 2016_q2_05
num_bytes: 167480
num_examples: 45
- name: 2016_q2_06
num_bytes: 230260
num_examples: 44
- name: 2016_q3_07
num_bytes: 101086
num_examples: 19
- name: 2016_q3_08
num_bytes: 82197
num_examples: 21
- name: 2016_q3_09
num_bytes: 247076
num_examples: 61
- name: 2016_q4_10
num_bytes: 362229
num_examples: 75
- name: 2016_q4_11
num_bytes: 158362
num_examples: 26
- name: 2016_q4_12
num_bytes: 107976
num_examples: 22
- name: 2017_q1_01
num_bytes: 203078
num_examples: 39
- name: 2017_q1_02
num_bytes: 106332
num_examples: 19
- name: 2017_q1_03
num_bytes: 301224
num_examples: 53
- name: 2017_q2_04
num_bytes: 295769
num_examples: 49
- name: 2017_q2_05
num_bytes: 169720
num_examples: 25
- name: 2017_q2_06
num_bytes: 240547
num_examples: 45
- name: 2017_q3_07
num_bytes: 105225
num_examples: 21
- name: 2017_q3_08
num_bytes: 303410
num_examples: 60
- name: 2017_q3_09
num_bytes: 553936
num_examples: 108
- name: 2017_q4_10
num_bytes: 376652
num_examples: 72
- name: 2017_q4_11
num_bytes: 417053
num_examples: 77
- name: 2017_q4_12
num_bytes: 251968
num_examples: 37
- name: 2018_q1_01
num_bytes: 181744
num_examples: 31
- name: 2018_q1_02
num_bytes: 150967
num_examples: 25
- name: 2018_q1_03
num_bytes: 38974
num_examples: 6
- name: 2018_q2_04
num_bytes: 48760
num_examples: 10
- name: 2018_q2_05
num_bytes: 83367
num_examples: 13
- name: 2018_q2_06
num_bytes: 349247
num_examples: 64
- name: 2018_q3_07
num_bytes: 124625
num_examples: 22
- name: 2018_q3_08
num_bytes: 62018
num_examples: 13
- name: 2018_q3_09
num_bytes: 279418
num_examples: 42
- name: 2018_q4_10
num_bytes: 445009
num_examples: 67
- name: 2018_q4_11
num_bytes: 210471
num_examples: 34
- name: 2018_q4_12
num_bytes: 292570
num_examples: 44
- name: 2019_q1_01
num_bytes: 63816
num_examples: 9
- name: 2019_q1_02
num_bytes: 24394
num_examples: 4
- name: 2019_q1_03
num_bytes: 60810
num_examples: 12
- name: 2019_q2_04
num_bytes: 48307
num_examples: 6
- name: 2019_q2_05
num_bytes: 49833
num_examples: 10
- name: 2019_q2_06
num_bytes: 172268
num_examples: 18
- name: 2019_q3_07
num_bytes: 141494
num_examples: 19
- name: 2019_q3_08
num_bytes: 86851
num_examples: 19
- name: 2019_q3_09
num_bytes: 117671
num_examples: 16
- name: 2019_q4_10
num_bytes: 120565
num_examples: 19
- name: 2019_q4_11
num_bytes: 88188
num_examples: 14
- name: 2019_q4_12
num_bytes: 26262
num_examples: 5
- name: 2020_q1_01
num_bytes: 57439
num_examples: 8
- name: 2020_q1_02
num_bytes: 19420
num_examples: 3
- name: 2020_q1_03
num_bytes: 93494
num_examples: 22
- name: 2020_q2_04
num_bytes: 37077
num_examples: 7
- name: 2020_q2_05
num_bytes: 36453
num_examples: 7
- name: 2020_q2_06
num_bytes: 24164
num_examples: 5
- name: 2020_q3_07
num_bytes: 14715
num_examples: 3
- name: 2020_q3_08
num_bytes: 28362
num_examples: 8
- name: 2020_q3_09
num_bytes: 22765
num_examples: 4
- name: 2020_q4_10
num_bytes: 59278
num_examples: 13
- name: 2020_q4_11
num_bytes: 61114
num_examples: 13
- name: 2020_q4_12
num_bytes: 52168
num_examples: 7
- name: 2021_q1_01
num_bytes: 107297
num_examples: 15
- name: 2021_q1_02
num_bytes: 551905
num_examples: 141
- name: 2021_q1_03
num_bytes: 113380
num_examples: 31
- name: 2021_q2_04
num_bytes: 9452
num_examples: 2
- name: 2021_q2_05
num_bytes: 69603
num_examples: 9
- name: 2021_q2_06
num_bytes: 122602
num_examples: 27
- name: 2021_q3_07
num_bytes: 245586
num_examples: 48
- name: 2021_q3_08
num_bytes: 83868
num_examples: 13
- name: 2021_q3_09
num_bytes: 86838
num_examples: 15
- name: 2021_q4_10
num_bytes: 114199
num_examples: 23
- name: 2021_q4_11
num_bytes: 46345
num_examples: 7
- name: 2021_q4_12
num_bytes: 15776
num_examples: 4
- name: 2022_q1_01
num_bytes: 42600
num_examples: 7
- name: 2022_q1_02
num_bytes: 39354
num_examples: 5
- name: 2022_q1_03
num_bytes: 72378
num_examples: 12
- name: 2022_q2_04
num_bytes: 11865
num_examples: 2
- name: 2022_q2_05
num_bytes: 36853
num_examples: 8
- name: 2022_q2_06
num_bytes: 17141
num_examples: 4
- name: 2022_q3_07
num_bytes: 74939
num_examples: 8
- name: 2022_q3_08
num_bytes: 87205
num_examples: 12
- name: 2022_q3_09
num_bytes: 18645
num_examples: 3
- name: 2022_q4_10
num_bytes: 47541
num_examples: 10
- name: 2022_q4_11
num_bytes: 120206
num_examples: 12
- name: 2022_q4_12
num_bytes: 169657
num_examples: 20
- name: 2023_q1_01
num_bytes: 36657
num_examples: 7
- name: 2023_q1_02
num_bytes: 46722
num_examples: 6
- name: 2023_q1_03
num_bytes: 27402
num_examples: 3
- name: 2023_q2_04
num_bytes: 13893
num_examples: 1
- name: 2023_q2_05
num_bytes: 50930
num_examples: 6
- name: 2023_q2_06
num_bytes: 34042
num_examples: 6
- name: 2023_q3_07
num_bytes: 11075
num_examples: 3
- name: 2023_q3_08
num_bytes: 2382
num_examples: 1
- name: 2023_q3_09
num_bytes: 50741
num_examples: 12
- name: 2023_q4_10
num_bytes: 116810
num_examples: 22
- name: 2023_q4_11
num_bytes: 27589
num_examples: 5
- name: 2023_q4_12
num_bytes: 5193
num_examples: 1
- name: 2024_q1_01
num_bytes: 92430
num_examples: 19
download_size: 41613278
dataset_size: 62655322
- config_name: fr
features:
- name: language
dtype: string
- name: wiki_page_id
dtype: string
- name: wiki_revision_id
dtype: string
- name: revision_timestamp
dtype: timestamp[us, tz=UTC]
- name: revision_year
dtype: uint16
- name: revision_month
dtype: uint16
- name: article_timestamp
dtype: timestamp[us, tz=UTC]
- name: article_year
dtype: uint16
- name: article_month
dtype: uint16
- name: url
dtype: string
- name: title
dtype: string
- name: raw_text
dtype: string
- name: cleaned_text
dtype: string
- name: categories
sequence: string
- name: sources
sequence: string
- name: dump
dtype: string
splits:
- name: 2005_q1_01
num_bytes: 16817
num_examples: 6
- name: 2005_q1_02
num_bytes: 16779
num_examples: 6
- name: 2005_q1_03
num_bytes: 52900
num_examples: 12
- name: 2005_q2_04
num_bytes: 322624
num_examples: 67
- name: 2005_q2_05
num_bytes: 736748
num_examples: 124
- name: 2005_q2_06
num_bytes: 206315
num_examples: 38
- name: 2005_q3_07
num_bytes: 341858
num_examples: 75
- name: 2005_q3_08
num_bytes: 175651
num_examples: 40
- name: 2005_q3_09
num_bytes: 124613
num_examples: 42
- name: 2005_q4_10
num_bytes: 218677
num_examples: 69
- name: 2005_q4_11
num_bytes: 98059
num_examples: 25
- name: 2005_q4_12
num_bytes: 231509
num_examples: 55
- name: 2006_q1_01
num_bytes: 280790
num_examples: 83
- name: 2006_q1_02
num_bytes: 122819
num_examples: 29
- name: 2006_q1_03
num_bytes: 171362
num_examples: 46
- name: 2006_q2_04
num_bytes: 114247
num_examples: 36
- name: 2006_q2_05
num_bytes: 112003
num_examples: 41
- name: 2006_q2_06
num_bytes: 107226
num_examples: 45
- name: 2006_q3_07
num_bytes: 108261
num_examples: 40
- name: 2006_q3_08
num_bytes: 149308
num_examples: 48
- name: 2006_q3_09
num_bytes: 142886
num_examples: 50
- name: 2006_q4_10
num_bytes: 209705
num_examples: 59
- name: 2006_q4_11
num_bytes: 981787
num_examples: 154
- name: 2006_q4_12
num_bytes: 641794
num_examples: 122
- name: 2007_q1_01
num_bytes: 847168
num_examples: 128
- name: 2007_q1_02
num_bytes: 770129
num_examples: 113
- name: 2007_q1_03
num_bytes: 905907
num_examples: 94
- name: 2007_q2_04
num_bytes: 805192
num_examples: 143
- name: 2007_q2_05
num_bytes: 1086414
num_examples: 150
- name: no_date
num_bytes: 12114729
num_examples: 1007
- name: 2007_q2_06
num_bytes: 514209
num_examples: 100
- name: 2007_q3_07
num_bytes: 1173717
num_examples: 134
- name: 2007_q3_08
num_bytes: 596764
num_examples: 128
- name: 2007_q3_09
num_bytes: 580336
num_examples: 128
- name: 2007_q4_10
num_bytes: 529189
num_examples: 112
- name: 2007_q4_11
num_bytes: 411951
num_examples: 106
- name: 2007_q4_12
num_bytes: 422209
num_examples: 92
- name: 2008_q1_01
num_bytes: 674860
num_examples: 125
- name: 2008_q1_02
num_bytes: 933131
num_examples: 154
- name: 2008_q1_03
num_bytes: 1085522
num_examples: 159
- name: 2008_q2_04
num_bytes: 865368
num_examples: 159
- name: 2008_q2_05
num_bytes: 534954
num_examples: 107
- name: 2008_q2_06
num_bytes: 398185
num_examples: 86
- name: 2008_q3_07
num_bytes: 623500
num_examples: 127
- name: 2008_q3_08
num_bytes: 728361
num_examples: 147
- name: 2008_q3_09
num_bytes: 790234
num_examples: 152
- name: 2008_q4_10
num_bytes: 862265
num_examples: 180
- name: 2008_q4_11
num_bytes: 863366
num_examples: 168
- name: 2008_q4_12
num_bytes: 778605
num_examples: 168
- name: 2009_q1_01
num_bytes: 818702
num_examples: 166
- name: 2009_q1_02
num_bytes: 797837
num_examples: 143
- name: 2009_q1_03
num_bytes: 1259414
num_examples: 183
- name: 2009_q2_04
num_bytes: 1562902
num_examples: 223
- name: 2009_q2_05
num_bytes: 1764145
num_examples: 266
- name: 2009_q2_06
num_bytes: 1221522
num_examples: 216
- name: 2009_q3_07
num_bytes: 1233804
num_examples: 251
- name: 2009_q3_08
num_bytes: 1054379
num_examples: 259
- name: 2009_q3_09
num_bytes: 990231
num_examples: 220
- name: 2011_q2_04
num_bytes: 733280
num_examples: 112
- name: 2009_q4_10
num_bytes: 1512604
num_examples: 348
- name: 2009_q4_11
num_bytes: 1152167
num_examples: 242
- name: 2009_q4_12
num_bytes: 1346511
num_examples: 264
- name: 2010_q1_01
num_bytes: 1383177
num_examples: 224
- name: 2010_q1_02
num_bytes: 1249596
num_examples: 219
- name: 2010_q1_03
num_bytes: 985064
num_examples: 175
- name: 2010_q2_04
num_bytes: 1016394
num_examples: 167
- name: 2010_q2_05
num_bytes: 1017338
num_examples: 156
- name: 2010_q2_06
num_bytes: 458223
num_examples: 82
- name: 2010_q3_07
num_bytes: 527931
num_examples: 94
- name: 2010_q3_08
num_bytes: 574020
num_examples: 100
- name: 2010_q3_09
num_bytes: 924636
num_examples: 139
- name: 2010_q4_10
num_bytes: 1140983
num_examples: 238
- name: 2010_q4_11
num_bytes: 774845
num_examples: 126
- name: 2010_q4_12
num_bytes: 874801
num_examples: 131
- name: 2011_q1_01
num_bytes: 817358
num_examples: 133
- name: 2011_q1_02
num_bytes: 836968
num_examples: 147
- name: 2011_q1_03
num_bytes: 872318
num_examples: 143
- name: 2011_q2_05
num_bytes: 840141
num_examples: 137
- name: 2011_q2_06
num_bytes: 679488
num_examples: 120
- name: 2011_q3_07
num_bytes: 711887
num_examples: 68
- name: 2011_q3_08
num_bytes: 439673
num_examples: 87
- name: 2011_q3_09
num_bytes: 617362
num_examples: 99
- name: 2011_q4_10
num_bytes: 545758
num_examples: 100
- name: 2011_q4_11
num_bytes: 389018
num_examples: 80
- name: 2011_q4_12
num_bytes: 642319
num_examples: 114
- name: 2012_q1_01
num_bytes: 381459
num_examples: 96
- name: 2012_q1_02
num_bytes: 561818
num_examples: 127
- name: 2012_q1_03
num_bytes: 571910
num_examples: 107
- name: 2012_q2_04
num_bytes: 580436
num_examples: 120
- name: 2012_q2_05
num_bytes: 402850
num_examples: 82
- name: 2012_q2_06
num_bytes: 263911
num_examples: 83
- name: 2012_q3_07
num_bytes: 502952
num_examples: 103
- name: 2012_q3_08
num_bytes: 231614
num_examples: 54
- name: 2024_q2_04
num_bytes: 62770
num_examples: 1
- name: 2012_q3_09
num_bytes: 309331
num_examples: 49
- name: 2012_q4_10
num_bytes: 376524
num_examples: 84
- name: 2012_q4_11
num_bytes: 491731
num_examples: 144
- name: 2012_q4_12
num_bytes: 618924
num_examples: 148
- name: 2013_q1_01
num_bytes: 771333
num_examples: 205
- name: 2013_q1_02
num_bytes: 750653
num_examples: 178
- name: 2013_q1_03
num_bytes: 453715
num_examples: 116
- name: 2013_q2_04
num_bytes: 686098
num_examples: 163
- name: 2013_q2_05
num_bytes: 600805
num_examples: 137
- name: 2013_q2_06
num_bytes: 387056
num_examples: 126
- name: 2013_q3_07
num_bytes: 507851
num_examples: 190
- name: 2013_q3_08
num_bytes: 405500
num_examples: 138
- name: 2013_q3_09
num_bytes: 549182
num_examples: 125
- name: 2013_q4_10
num_bytes: 431982
num_examples: 111
- name: 2013_q4_11
num_bytes: 520244
num_examples: 123
- name: 2013_q4_12
num_bytes: 480457
num_examples: 125
- name: 2014_q1_01
num_bytes: 307079
num_examples: 91
- name: 2014_q1_02
num_bytes: 526961
num_examples: 140
- name: 2014_q1_03
num_bytes: 479109
num_examples: 111
- name: 2024_q1_02
num_bytes: 3983
num_examples: 1
- name: 2014_q2_04
num_bytes: 454264
num_examples: 107
- name: 2014_q2_05
num_bytes: 382490
num_examples: 100
- name: 2014_q2_06
num_bytes: 334245
num_examples: 95
- name: 2014_q3_07
num_bytes: 168277
num_examples: 71
- name: 2014_q3_08
num_bytes: 414611
num_examples: 125
- name: 2014_q3_09
num_bytes: 487561
num_examples: 130
- name: 2014_q4_10
num_bytes: 433350
num_examples: 117
- name: 2014_q4_11
num_bytes: 337759
num_examples: 105
- name: 2014_q4_12
num_bytes: 360099
num_examples: 99
- name: 2015_q1_01
num_bytes: 507363
num_examples: 137
- name: 2015_q1_02
num_bytes: 499253
num_examples: 132
- name: 2015_q1_03
num_bytes: 765330
num_examples: 161
- name: 2015_q3_09
num_bytes: 371293
num_examples: 97
- name: 2015_q2_04
num_bytes: 663231
num_examples: 142
- name: 2015_q2_05
num_bytes: 855676
num_examples: 201
- name: 2015_q2_06
num_bytes: 560424
num_examples: 164
- name: 2016_q3_08
num_bytes: 552588
num_examples: 134
- name: 2015_q3_07
num_bytes: 268777
num_examples: 90
- name: 2015_q3_08
num_bytes: 298843
num_examples: 87
- name: 2015_q4_10
num_bytes: 410433
num_examples: 111
- name: 2015_q4_11
num_bytes: 432560
num_examples: 114
- name: 2015_q4_12
num_bytes: 379851
num_examples: 97
- name: 2016_q1_01
num_bytes: 401621
num_examples: 100
- name: 2016_q1_02
num_bytes: 429242
num_examples: 104
- name: 2016_q1_03
num_bytes: 397313
num_examples: 102
- name: 2016_q2_04
num_bytes: 340370
num_examples: 80
- name: 2016_q2_05
num_bytes: 424694
num_examples: 94
- name: 2016_q2_06
num_bytes: 288428
num_examples: 90
- name: 2016_q3_07
num_bytes: 188716
num_examples: 69
- name: 2016_q3_09
num_bytes: 435797
num_examples: 100
- name: 2020_q1_01
num_bytes: 196173
num_examples: 57
- name: 2016_q4_10
num_bytes: 496955
num_examples: 123
- name: 2016_q4_11
num_bytes: 614857
num_examples: 159
- name: 2016_q4_12
num_bytes: 400196
num_examples: 110
- name: 2017_q1_01
num_bytes: 664391
num_examples: 149
- name: 2017_q1_02
num_bytes: 576620
num_examples: 114
- name: 2017_q1_03
num_bytes: 610612
num_examples: 139
- name: 2017_q2_04
num_bytes: 590644
num_examples: 131
- name: 2017_q2_05
num_bytes: 364864
num_examples: 87
- name: 2017_q2_06
num_bytes: 106747
num_examples: 51
- name: 2017_q3_07
num_bytes: 75717
num_examples: 42
- name: 2017_q3_08
num_bytes: 169537
num_examples: 52
- name: 2017_q3_09
num_bytes: 337750
num_examples: 82
- name: 2017_q4_10
num_bytes: 303349
num_examples: 79
- name: 2017_q4_11
num_bytes: 215308
num_examples: 62
- name: 2017_q4_12
num_bytes: 233482
num_examples: 59
- name: 2018_q1_01
num_bytes: 395422
num_examples: 72
- name: 2018_q1_02
num_bytes: 349279
num_examples: 82
- name: 2018_q1_03
num_bytes: 457864
num_examples: 125
- name: 2018_q2_04
num_bytes: 358932
num_examples: 84
- name: 2018_q2_05
num_bytes: 275477
num_examples: 70
- name: 2018_q2_06
num_bytes: 79726
num_examples: 43
- name: 2018_q3_07
num_bytes: 108459
num_examples: 48
- name: 2018_q3_08
num_bytes: 194753
num_examples: 54
- name: 2018_q3_09
num_bytes: 355606
num_examples: 74
- name: 2018_q4_10
num_bytes: 325717
num_examples: 68
- name: 2018_q4_11
num_bytes: 291465
num_examples: 64
- name: 2018_q4_12
num_bytes: 1242103
num_examples: 109
- name: 2019_q1_01
num_bytes: 1734577
num_examples: 102
- name: 2019_q1_02
num_bytes: 1142460
num_examples: 92
- name: 2019_q1_03
num_bytes: 299683
num_examples: 81
- name: 2019_q2_04
num_bytes: 333749
num_examples: 78
- name: 2019_q2_05
num_bytes: 272379
num_examples: 69
- name: 2019_q2_06
num_bytes: 95724
num_examples: 46
- name: 2019_q3_07
num_bytes: 166013
num_examples: 66
- name: 2019_q3_08
num_bytes: 162151
num_examples: 54
- name: 2019_q3_09
num_bytes: 476161
num_examples: 97
- name: 2019_q4_10
num_bytes: 378725
num_examples: 86
- name: 2019_q4_11
num_bytes: 256965
num_examples: 66
- name: 2019_q4_12
num_bytes: 267249
num_examples: 67
- name: 2020_q1_02
num_bytes: 348236
num_examples: 82
- name: 2020_q1_03
num_bytes: 871964
num_examples: 281
- name: 2020_q3_09
num_bytes: 238777
num_examples: 67
- name: 2020_q2_04
num_bytes: 451877
num_examples: 100
- name: 2020_q2_05
num_bytes: 691879
num_examples: 136
- name: 2020_q2_06
num_bytes: 280122
num_examples: 70
- name: 2020_q3_07
num_bytes: 212583
num_examples: 68
- name: 2020_q3_08
num_bytes: 133162
num_examples: 53
- name: 2020_q4_10
num_bytes: 273716
num_examples: 47
- name: 2020_q4_11
num_bytes: 307339
num_examples: 51
- name: 2020_q4_12
num_bytes: 405866
num_examples: 63
- name: 2021_q1_01
num_bytes: 414243
num_examples: 57
- name: 2021_q1_02
num_bytes: 297417
num_examples: 42
- name: 2021_q1_03
num_bytes: 254278
num_examples: 39
- name: 2021_q2_04
num_bytes: 223601
num_examples: 37
- name: 2021_q2_05
num_bytes: 380467
num_examples: 72
- name: 2021_q2_06
num_bytes: 167973
num_examples: 37
- name: 2021_q3_07
num_bytes: 104195
num_examples: 24
- name: 2021_q3_08
num_bytes: 134256
num_examples: 24
- name: 2021_q3_09
num_bytes: 238820
num_examples: 33
- name: 2021_q4_10
num_bytes: 247669
num_examples: 34
- name: 2021_q4_11
num_bytes: 219545
num_examples: 30
- name: 2021_q4_12
num_bytes: 195579
num_examples: 25
- name: 2022_q1_01
num_bytes: 50719
num_examples: 10
- name: 2022_q1_02
num_bytes: 191644
num_examples: 45
- name: 2022_q1_03
num_bytes: 117225
num_examples: 20
- name: 2022_q2_05
num_bytes: 42334
num_examples: 10
- name: 2022_q2_04
num_bytes: 58154
num_examples: 11
- name: 2022_q2_06
num_bytes: 47616
num_examples: 11
- name: 2022_q3_07
num_bytes: 64975
num_examples: 12
- name: 2022_q3_08
num_bytes: 266266
num_examples: 29
- name: 2022_q3_09
num_bytes: 906661
num_examples: 85
- name: 2022_q4_12
num_bytes: 28612
num_examples: 7
- name: 2022_q4_10
num_bytes: 355137
num_examples: 39
- name: 2022_q4_11
num_bytes: 23375
num_examples: 3
- name: 2023_q1_01
num_bytes: 50750
num_examples: 14
- name: 2023_q1_03
num_bytes: 107013
num_examples: 15
- name: 2023_q1_02
num_bytes: 57972
num_examples: 13
- name: 2023_q2_04
num_bytes: 148470
num_examples: 34
- name: 2023_q2_05
num_bytes: 119025
num_examples: 24
- name: 2023_q2_06
num_bytes: 266191
num_examples: 53
- name: 2023_q3_07
num_bytes: 360919
num_examples: 62
- name: 2023_q3_08
num_bytes: 68572
num_examples: 15
- name: 2023_q3_09
num_bytes: 78522
num_examples: 20
- name: 2023_q4_10
num_bytes: 145124
num_examples: 27
- name: 2023_q4_11
num_bytes: 84852
num_examples: 13
- name: 2023_q4_12
num_bytes: 135577
num_examples: 22
- name: 2024_q1_01
num_bytes: 60657
num_examples: 13
download_size: 57608409
dataset_size: 122318263
- config_name: it
features:
- name: language
dtype: string
- name: wiki_page_id
dtype: string
- name: wiki_revision_id
dtype: string
- name: revision_timestamp
dtype: timestamp[us, tz=UTC]
- name: revision_year
dtype: uint16
- name: revision_month
dtype: uint16
- name: article_timestamp
dtype: timestamp[us, tz=UTC]
- name: article_year
dtype: uint16
- name: article_month
dtype: uint16
- name: url
dtype: string
- name: title
dtype: string
- name: raw_text
dtype: string
- name: cleaned_text
dtype: string
- name: categories
sequence: string
- name: sources
sequence: string
- name: dump
dtype: string
splits:
- name: 2005_q1_03
num_bytes: 17498
num_examples: 7
- name: 2005_q2_04
num_bytes: 261191
num_examples: 83
- name: no_date
num_bytes: 5646989
num_examples: 1788
- name: 2005_q2_05
num_bytes: 166687
num_examples: 44
- name: 2006_q1_02
num_bytes: 761513
num_examples: 156
- name: 2005_q2_06
num_bytes: 124553
num_examples: 35
- name: 2005_q3_07
num_bytes: 306323
num_examples: 79
- name: 2005_q3_09
num_bytes: 396624
num_examples: 128
- name: 2005_q4_10
num_bytes: 565051
num_examples: 215
- name: 2005_q3_08
num_bytes: 395021
num_examples: 148
- name: 2005_q4_11
num_bytes: 400822
num_examples: 133
- name: 2005_q4_12
num_bytes: 377971
num_examples: 133
- name: 2006_q1_01
num_bytes: 477483
num_examples: 174
- name: 2006_q2_04
num_bytes: 654896
num_examples: 195
- name: 2006_q2_05
num_bytes: 467410
num_examples: 97
- name: 2005_q1_01
num_bytes: 1306
num_examples: 1
- name: 2006_q1_03
num_bytes: 471988
num_examples: 131
- name: 2007_q3_08
num_bytes: 512832
num_examples: 126
- name: 2007_q4_10
num_bytes: 1287037
num_examples: 267
- name: 2006_q2_06
num_bytes: 249693
num_examples: 68
- name: 2006_q3_09
num_bytes: 565214
num_examples: 90
- name: 2006_q4_10
num_bytes: 437569
num_examples: 142
- name: 2006_q4_11
num_bytes: 431840
num_examples: 111
- name: 2006_q4_12
num_bytes: 527187
num_examples: 167
- name: 2007_q1_01
num_bytes: 641726
num_examples: 176
- name: 2007_q1_02
num_bytes: 899325
num_examples: 200
- name: 2007_q1_03
num_bytes: 769721
num_examples: 165
- name: 2007_q2_04
num_bytes: 605399
num_examples: 161
- name: 2006_q3_07
num_bytes: 633304
num_examples: 177
- name: 2007_q2_05
num_bytes: 577901
num_examples: 117
- name: 2007_q2_06
num_bytes: 360770
num_examples: 88
- name: 2006_q3_08
num_bytes: 658208
num_examples: 170
- name: 2007_q3_07
num_bytes: 375414
num_examples: 95
- name: 2007_q3_09
num_bytes: 1040794
num_examples: 222
- name: 2008_q1_03
num_bytes: 653124
num_examples: 125
- name: 2008_q1_02
num_bytes: 583790
num_examples: 99
- name: 2007_q4_11
num_bytes: 664301
num_examples: 154
- name: 2007_q4_12
num_bytes: 773904
num_examples: 158
- name: 2008_q1_01
num_bytes: 793518
num_examples: 174
- name: 2008_q2_06
num_bytes: 1043018
num_examples: 166
- name: 2008_q2_04
num_bytes: 709803
num_examples: 124
- name: 2008_q2_05
num_bytes: 864891
num_examples: 128
- name: 2008_q3_07
num_bytes: 765773
num_examples: 161
- name: 2009_q3_08
num_bytes: 345152
num_examples: 87
- name: 2008_q3_08
num_bytes: 1120831
num_examples: 239
- name: 2008_q3_09
num_bytes: 947788
num_examples: 204
- name: 2008_q4_10
num_bytes: 662947
num_examples: 146
- name: 2008_q4_11
num_bytes: 510399
num_examples: 137
- name: 2008_q4_12
num_bytes: 525476
num_examples: 128
- name: 2009_q1_01
num_bytes: 265353
num_examples: 62
- name: 2009_q1_03
num_bytes: 177725
num_examples: 33
- name: 2009_q1_02
num_bytes: 249695
num_examples: 59
- name: 2012_q3_08
num_bytes: 334614
num_examples: 62
- name: 2009_q2_04
num_bytes: 287598
num_examples: 36
- name: 2009_q2_05
num_bytes: 176110
num_examples: 21
- name: 2009_q2_06
num_bytes: 116787
num_examples: 22
- name: 2009_q3_07
num_bytes: 233441
num_examples: 36
- name: 2009_q3_09
num_bytes: 228219
num_examples: 63
- name: 2009_q4_12
num_bytes: 193961
num_examples: 41
- name: 2009_q4_10
num_bytes: 228662
num_examples: 58
- name: 2009_q4_11
num_bytes: 170709
num_examples: 43
- name: 2010_q1_01
num_bytes: 143885
num_examples: 26
- name: 2010_q1_02
num_bytes: 300295
num_examples: 19
- name: 2010_q1_03
num_bytes: 103879
num_examples: 20
- name: 2010_q2_04
num_bytes: 122737
num_examples: 23
- name: 2010_q2_05
num_bytes: 262961
num_examples: 20
- name: 2010_q2_06
num_bytes: 624705
num_examples: 62
- name: 2010_q3_07
num_bytes: 428019
num_examples: 43
- name: 2010_q3_08
num_bytes: 120926
num_examples: 23
- name: 2010_q3_09
num_bytes: 299829
num_examples: 63
- name: 2010_q4_10
num_bytes: 312406
num_examples: 45
- name: 2010_q4_11
num_bytes: 247291
num_examples: 33
- name: 2010_q4_12
num_bytes: 93335
num_examples: 23
- name: 2011_q1_01
num_bytes: 100483
num_examples: 23
- name: 2011_q1_02
num_bytes: 62456
num_examples: 14
- name: 2011_q1_03
num_bytes: 97232
num_examples: 17
- name: 2011_q2_04
num_bytes: 50141
num_examples: 4
- name: 2011_q2_05
num_bytes: 476339
num_examples: 34
- name: 2011_q2_06
num_bytes: 83372
num_examples: 21
- name: 2011_q3_07
num_bytes: 112684
num_examples: 16
- name: 2011_q3_08
num_bytes: 155086
num_examples: 18
- name: 2011_q3_09
num_bytes: 233022
num_examples: 33
- name: 2011_q4_10
num_bytes: 310714
num_examples: 49
- name: 2011_q4_11
num_bytes: 272901
num_examples: 24
- name: 2011_q4_12
num_bytes: 151179
num_examples: 14
- name: 2012_q1_01
num_bytes: 195281
num_examples: 19
- name: 2012_q1_02
num_bytes: 273277
num_examples: 32
- name: 2012_q1_03
num_bytes: 258218
num_examples: 31
- name: 2012_q2_04
num_bytes: 393185
num_examples: 59
- name: 2012_q2_05
num_bytes: 420244
num_examples: 51
- name: 2012_q2_06
num_bytes: 159958
num_examples: 28
- name: 2012_q3_07
num_bytes: 169536
num_examples: 34
- name: 2012_q3_09
num_bytes: 336696
num_examples: 41
- name: 2012_q4_10
num_bytes: 245075
num_examples: 39
- name: 2012_q4_11
num_bytes: 155775
num_examples: 20
- name: 2012_q4_12
num_bytes: 197760
num_examples: 22
- name: 2013_q1_01
num_bytes: 139118
num_examples: 19
- name: 2013_q1_02
num_bytes: 254778
num_examples: 30
- name: 2013_q1_03
num_bytes: 201072
num_examples: 21
- name: 2013_q2_04
num_bytes: 250915
num_examples: 24
- name: 2013_q2_05
num_bytes: 159330
num_examples: 19
- name: 2013_q2_06
num_bytes: 22959
num_examples: 2
- name: 2013_q3_07
num_bytes: 38234
num_examples: 7
- name: 2013_q3_08
num_bytes: 44716
num_examples: 7
- name: 2013_q3_09
num_bytes: 234677
num_examples: 29
- name: 2013_q4_10
num_bytes: 380476
num_examples: 41
- name: 2013_q4_11
num_bytes: 406701
num_examples: 42
- name: 2013_q4_12
num_bytes: 416890
num_examples: 41
- name: 2014_q1_01
num_bytes: 323249
num_examples: 37
- name: 2014_q1_02
num_bytes: 358984
num_examples: 47
- name: 2014_q1_03
num_bytes: 458070
num_examples: 57
- name: 2014_q2_04
num_bytes: 324033
num_examples: 39
- name: 2014_q2_05
num_bytes: 284361
num_examples: 32
- name: 2014_q2_06
num_bytes: 148852
num_examples: 33
- name: 2014_q3_07
num_bytes: 75517
num_examples: 10
- name: 2014_q4_12
num_bytes: 321044
num_examples: 31
- name: 2014_q3_08
num_bytes: 71993
num_examples: 11
- name: 2014_q3_09
num_bytes: 177184
num_examples: 17
- name: 2014_q4_10
num_bytes: 269117
num_examples: 31
- name: 2014_q4_11
num_bytes: 249397
num_examples: 21
- name: 2015_q1_01
num_bytes: 292103
num_examples: 28
- name: 2015_q1_02
num_bytes: 220753
num_examples: 21
- name: 2015_q1_03
num_bytes: 293867
num_examples: 25
- name: 2015_q2_04
num_bytes: 301852
num_examples: 32
- name: 2015_q2_05
num_bytes: 311806
num_examples: 38
- name: 2015_q2_06
num_bytes: 122176
num_examples: 24
- name: 2015_q3_07
num_bytes: 79969
num_examples: 8
- name: 2016_q1_02
num_bytes: 468995
num_examples: 23
- name: 2015_q3_08
num_bytes: 124990
num_examples: 15
- name: 2015_q3_09
num_bytes: 195157
num_examples: 18
- name: 2015_q4_10
num_bytes: 223206
num_examples: 21
- name: 2015_q4_11
num_bytes: 234178
num_examples: 23
- name: 2015_q4_12
num_bytes: 219161
num_examples: 20
- name: 2016_q1_01
num_bytes: 170589
num_examples: 15
- name: 2016_q3_09
num_bytes: 167572
num_examples: 24
- name: 2016_q1_03
num_bytes: 259776
num_examples: 18
- name: 2016_q2_04
num_bytes: 277201
num_examples: 27
- name: 2016_q2_05
num_bytes: 187354
num_examples: 23
- name: 2016_q2_06
num_bytes: 67792
num_examples: 13
- name: 2016_q3_07
num_bytes: 169261
num_examples: 47
- name: 2016_q3_08
num_bytes: 333480
num_examples: 81
- name: 2018_q4_12
num_bytes: 182860
num_examples: 20
- name: 2016_q4_10
num_bytes: 238793
num_examples: 30
- name: 2016_q4_11
num_bytes: 156812
num_examples: 13
- name: 2016_q4_12
num_bytes: 162924
num_examples: 20
- name: 2017_q1_01
num_bytes: 150774
num_examples: 26
- name: 2017_q1_02
num_bytes: 84471
num_examples: 25
- name: 2017_q1_03
num_bytes: 50747
num_examples: 9
- name: 2017_q2_04
num_bytes: 1895
num_examples: 1
- name: 2017_q2_05
num_bytes: 55209
num_examples: 20
- name: 2017_q2_06
num_bytes: 27869
num_examples: 9
- name: 2017_q3_07
num_bytes: 36470
num_examples: 10
- name: 2017_q3_08
num_bytes: 58854
num_examples: 6
- name: 2017_q3_09
num_bytes: 104351
num_examples: 12
- name: 2017_q4_10
num_bytes: 157693
num_examples: 13
- name: 2017_q4_11
num_bytes: 105643
num_examples: 9
- name: 2017_q4_12
num_bytes: 131793
num_examples: 10
- name: 2018_q1_01
num_bytes: 71735
num_examples: 6
- name: 2018_q1_02
num_bytes: 130998
num_examples: 12
- name: 2018_q1_03
num_bytes: 135903
num_examples: 17
- name: 2018_q2_04
num_bytes: 184133
num_examples: 17
- name: 2018_q2_05
num_bytes: 100476
num_examples: 9
- name: 2018_q2_06
num_bytes: 21262
num_examples: 4
- name: 2018_q3_08
num_bytes: 156752
num_examples: 45
- name: 2018_q3_09
num_bytes: 266025
num_examples: 53
- name: 2018_q3_07
num_bytes: 5925
num_examples: 1
- name: 2018_q4_10
num_bytes: 482188
num_examples: 80
- name: 2018_q4_11
num_bytes: 692275
num_examples: 112
- name: 2019_q1_01
num_bytes: 100820
num_examples: 14
- name: 2019_q1_02
num_bytes: 152197
num_examples: 18
- name: 2019_q1_03
num_bytes: 118947
num_examples: 14
- name: 2019_q2_04
num_bytes: 173835
num_examples: 16
- name: 2019_q2_05
num_bytes: 120055
num_examples: 12
- name: 2019_q2_06
num_bytes: 38955
num_examples: 6
- name: 2019_q3_07
num_bytes: 23181
num_examples: 4
- name: 2019_q3_08
num_bytes: 210869
num_examples: 25
- name: 2019_q3_09
num_bytes: 38451
num_examples: 3
- name: 2019_q4_10
num_bytes: 33498
num_examples: 8
- name: 2019_q4_11
num_bytes: 98385
num_examples: 26
- name: 2019_q4_12
num_bytes: 46345
num_examples: 11
- name: 2020_q1_01
num_bytes: 55326
num_examples: 13
- name: 2020_q1_02
num_bytes: 29672
num_examples: 7
- name: 2020_q1_03
num_bytes: 103939
num_examples: 21
- name: 2020_q2_04
num_bytes: 128535
num_examples: 38
- name: 2020_q2_05
num_bytes: 416383
num_examples: 107
- name: 2020_q2_06
num_bytes: 348530
num_examples: 74
- name: 2020_q3_07
num_bytes: 42660
num_examples: 11
- name: 2020_q3_08
num_bytes: 49823
num_examples: 12
- name: 2020_q3_09
num_bytes: 48014
num_examples: 9
- name: 2020_q4_10
num_bytes: 129677
num_examples: 34
- name: 2020_q4_11
num_bytes: 117229
num_examples: 28
- name: 2020_q4_12
num_bytes: 36880
num_examples: 9
- name: 2021_q1_01
num_bytes: 78063
num_examples: 20
- name: 2021_q1_02
num_bytes: 86594
num_examples: 25
- name: 2021_q1_03
num_bytes: 68001
num_examples: 16
- name: 2021_q2_04
num_bytes: 22768
num_examples: 6
- name: 2021_q2_05
num_bytes: 120363
num_examples: 30
- name: 2021_q2_06
num_bytes: 41382
num_examples: 7
- name: 2021_q3_07
num_bytes: 72451
num_examples: 21
- name: 2021_q3_08
num_bytes: 20147
num_examples: 5
- name: 2021_q3_09
num_bytes: 35675
num_examples: 7
- name: 2021_q4_10
num_bytes: 10702
num_examples: 3
- name: 2021_q4_11
num_bytes: 8137
num_examples: 3
- name: 2021_q4_12
num_bytes: 9426
num_examples: 3
- name: 2022_q1_01
num_bytes: 36827
num_examples: 9
- name: 2022_q1_02
num_bytes: 87731
num_examples: 18
- name: 2022_q1_03
num_bytes: 45245
num_examples: 11
- name: 2022_q2_04
num_bytes: 39204
num_examples: 11
- name: 2022_q2_05
num_bytes: 101670
num_examples: 32
- name: 2022_q2_06
num_bytes: 22395
num_examples: 6
- name: 2022_q3_07
num_bytes: 48899
num_examples: 19
- name: 2022_q3_08
num_bytes: 125526
num_examples: 24
- name: 2022_q3_09
num_bytes: 235913
num_examples: 45
- name: 2022_q4_10
num_bytes: 375576
num_examples: 46
- name: 2022_q4_11
num_bytes: 187025
num_examples: 24
- name: 2022_q4_12
num_bytes: 176688
num_examples: 49
- name: 2023_q1_01
num_bytes: 234129
num_examples: 41
- name: 2023_q1_02
num_bytes: 213016
num_examples: 33
- name: 2023_q1_03
num_bytes: 195923
num_examples: 29
- name: 2023_q2_04
num_bytes: 169570
num_examples: 26
- name: 2023_q2_05
num_bytes: 136962
num_examples: 16
- name: 2023_q2_06
num_bytes: 83646
num_examples: 17
- name: 2023_q3_07
num_bytes: 18021
num_examples: 6
- name: 2023_q3_08
num_bytes: 83993
num_examples: 17
- name: 2023_q3_09
num_bytes: 186950
num_examples: 31
- name: 2023_q4_10
num_bytes: 187116
num_examples: 42
- name: 2023_q4_11
num_bytes: 169804
num_examples: 25
- name: 2023_q4_12
num_bytes: 164520
num_examples: 24
- name: 2024_q1_01
num_bytes: 131163
num_examples: 22
download_size: 35371694
dataset_size: 64646893
configs:
- config_name: de
data_files:
- split: 2004_q4_12
path: de/2004_q4_12-*
- split: 2005_q1_01
path: de/2005_q1_01-*
- split: 2005_q1_02
path: de/2005_q1_02-*
- split: 2005_q1_03
path: de/2005_q1_03-*
- split: 2005_q3_08
path: de/2005_q3_08-*
- split: 2005_q2_04
path: de/2005_q2_04-*
- split: 2005_q2_05
path: de/2005_q2_05-*
- split: 2005_q3_07
path: de/2005_q3_07-*
- split: 2005_q2_06
path: de/2005_q2_06-*
- split: 2005_q4_10
path: de/2005_q4_10-*
- split: 2005_q4_11
path: de/2005_q4_11-*
- split: 2007_q1_03
path: de/2007_q1_03-*
- split: 2005_q3_09
path: de/2005_q3_09-*
- split: 2004_q3_08
path: de/2004_q3_08-*
- split: 2005_q4_12
path: de/2005_q4_12-*
- split: 2006_q1_01
path: de/2006_q1_01-*
- split: 2006_q1_02
path: de/2006_q1_02-*
- split: 2006_q1_03
path: de/2006_q1_03-*
- split: 2006_q2_06
path: de/2006_q2_06-*
- split: 2006_q2_04
path: de/2006_q2_04-*
- split: 2006_q2_05
path: de/2006_q2_05-*
- split: 2006_q3_07
path: de/2006_q3_07-*
- split: 2006_q3_08
path: de/2006_q3_08-*
- split: 2006_q3_09
path: de/2006_q3_09-*
- split: 2006_q4_10
path: de/2006_q4_10-*
- split: 2006_q4_11
path: de/2006_q4_11-*
- split: 2006_q4_12
path: de/2006_q4_12-*
- split: 2007_q1_02
path: de/2007_q1_02-*
- split: 2007_q1_01
path: de/2007_q1_01-*
- split: 2007_q2_06
path: de/2007_q2_06-*
- split: 2007_q2_04
path: de/2007_q2_04-*
- split: 2007_q2_05
path: de/2007_q2_05-*
- split: 2007_q3_07
path: de/2007_q3_07-*
- split: 2007_q3_08
path: de/2007_q3_08-*
- split: 2007_q3_09
path: de/2007_q3_09-*
- split: 2007_q4_10
path: de/2007_q4_10-*
- split: 2007_q4_11
path: de/2007_q4_11-*
- split: 2007_q4_12
path: de/2007_q4_12-*
- split: 2008_q1_01
path: de/2008_q1_01-*
- split: 2008_q1_02
path: de/2008_q1_02-*
- split: 2008_q1_03
path: de/2008_q1_03-*
- split: 2008_q2_04
path: de/2008_q2_04-*
- split: 2008_q2_05
path: de/2008_q2_05-*
- split: 2008_q2_06
path: de/2008_q2_06-*
- split: 2008_q3_07
path: de/2008_q3_07-*
- split: 2008_q3_08
path: de/2008_q3_08-*
- split: 2008_q3_09
path: de/2008_q3_09-*
- split: 2008_q4_10
path: de/2008_q4_10-*
- split: 2008_q4_11
path: de/2008_q4_11-*
- split: 2008_q4_12
path: de/2008_q4_12-*
- split: 2009_q1_01
path: de/2009_q1_01-*
- split: 2009_q1_02
path: de/2009_q1_02-*
- split: 2009_q1_03
path: de/2009_q1_03-*
- split: 2009_q2_04
path: de/2009_q2_04-*
- split: 2009_q2_05
path: de/2009_q2_05-*
- split: 2009_q2_06
path: de/2009_q2_06-*
- split: 2009_q3_07
path: de/2009_q3_07-*
- split: 2009_q3_08
path: de/2009_q3_08-*
- split: 2009_q3_09
path: de/2009_q3_09-*
- split: 2009_q4_10
path: de/2009_q4_10-*
- split: 2009_q4_11
path: de/2009_q4_11-*
- split: 2009_q4_12
path: de/2009_q4_12-*
- split: 2010_q1_01
path: de/2010_q1_01-*
- split: 2010_q1_02
path: de/2010_q1_02-*
- split: 2010_q1_03
path: de/2010_q1_03-*
- split: 2010_q2_04
path: de/2010_q2_04-*
- split: 2010_q2_05
path: de/2010_q2_05-*
- split: 2010_q2_06
path: de/2010_q2_06-*
- split: 2010_q3_07
path: de/2010_q3_07-*
- split: 2010_q3_08
path: de/2010_q3_08-*
- split: 2010_q3_09
path: de/2010_q3_09-*
- split: 2010_q4_10
path: de/2010_q4_10-*
- split: 2010_q4_11
path: de/2010_q4_11-*
- split: 2010_q4_12
path: de/2010_q4_12-*
- split: 2011_q1_01
path: de/2011_q1_01-*
- split: 2011_q1_02
path: de/2011_q1_02-*
- split: 2011_q1_03
path: de/2011_q1_03-*
- split: 2011_q2_04
path: de/2011_q2_04-*
- split: 2011_q2_05
path: de/2011_q2_05-*
- split: 2011_q2_06
path: de/2011_q2_06-*
- split: 2011_q3_07
path: de/2011_q3_07-*
- split: 2011_q3_08
path: de/2011_q3_08-*
- split: 2011_q3_09
path: de/2011_q3_09-*
- split: 2011_q4_10
path: de/2011_q4_10-*
- split: 2011_q4_11
path: de/2011_q4_11-*
- split: 2011_q4_12
path: de/2011_q4_12-*
- split: 2012_q1_01
path: de/2012_q1_01-*
- split: 2012_q1_02
path: de/2012_q1_02-*
- split: 2012_q1_03
path: de/2012_q1_03-*
- split: 2012_q2_04
path: de/2012_q2_04-*
- split: 2012_q2_05
path: de/2012_q2_05-*
- split: 2012_q2_06
path: de/2012_q2_06-*
- split: 2012_q3_07
path: de/2012_q3_07-*
- split: 2012_q3_08
path: de/2012_q3_08-*
- split: 2012_q3_09
path: de/2012_q3_09-*
- split: 2012_q4_10
path: de/2012_q4_10-*
- split: 2012_q4_11
path: de/2012_q4_11-*
- split: 2012_q4_12
path: de/2012_q4_12-*
- split: 2013_q1_01
path: de/2013_q1_01-*
- split: no_date
path: de/no_date-*
- split: 2013_q1_02
path: de/2013_q1_02-*
- split: 2013_q1_03
path: de/2013_q1_03-*
- split: 2013_q2_04
path: de/2013_q2_04-*
- split: 2013_q2_05
path: de/2013_q2_05-*
- split: 2013_q2_06
path: de/2013_q2_06-*
- split: 2013_q3_07
path: de/2013_q3_07-*
- split: 2013_q3_09
path: de/2013_q3_09-*
- split: 2013_q3_08
path: de/2013_q3_08-*
- split: 2013_q4_10
path: de/2013_q4_10-*
- split: 2013_q4_11
path: de/2013_q4_11-*
- split: 2013_q4_12
path: de/2013_q4_12-*
- split: 2014_q1_01
path: de/2014_q1_01-*
- split: 2014_q1_02
path: de/2014_q1_02-*
- split: 2014_q1_03
path: de/2014_q1_03-*
- split: 2014_q2_04
path: de/2014_q2_04-*
- split: 2014_q2_05
path: de/2014_q2_05-*
- split: 2014_q2_06
path: de/2014_q2_06-*
- split: 2014_q3_07
path: de/2014_q3_07-*
- split: 2014_q3_08
path: de/2014_q3_08-*
- split: 2014_q3_09
path: de/2014_q3_09-*
- split: 2014_q4_10
path: de/2014_q4_10-*
- split: 2014_q4_11
path: de/2014_q4_11-*
- split: 2014_q4_12
path: de/2014_q4_12-*
- split: 2015_q1_01
path: de/2015_q1_01-*
- split: 2015_q1_02
path: de/2015_q1_02-*
- split: 2015_q1_03
path: de/2015_q1_03-*
- split: 2015_q2_04
path: de/2015_q2_04-*
- split: 2015_q2_06
path: de/2015_q2_06-*
- split: 2015_q2_05
path: de/2015_q2_05-*
- split: 2015_q3_07
path: de/2015_q3_07-*
- split: 2015_q3_08
path: de/2015_q3_08-*
- split: 2015_q3_09
path: de/2015_q3_09-*
- split: 2015_q4_10
path: de/2015_q4_10-*
- split: 2015_q4_11
path: de/2015_q4_11-*
- split: 2015_q4_12
path: de/2015_q4_12-*
- split: 2016_q1_02
path: de/2016_q1_02-*
- split: 2016_q1_01
path: de/2016_q1_01-*
- split: 2016_q1_03
path: de/2016_q1_03-*
- split: 2016_q2_05
path: de/2016_q2_05-*
- split: 2016_q2_04
path: de/2016_q2_04-*
- split: 2016_q2_06
path: de/2016_q2_06-*
- split: 2016_q3_07
path: de/2016_q3_07-*
- split: 2016_q3_08
path: de/2016_q3_08-*
- split: 2016_q3_09
path: de/2016_q3_09-*
- split: 2016_q4_10
path: de/2016_q4_10-*
- split: 2016_q4_12
path: de/2016_q4_12-*
- split: 2016_q4_11
path: de/2016_q4_11-*
- split: 2017_q1_01
path: de/2017_q1_01-*
- split: 2017_q1_02
path: de/2017_q1_02-*
- split: 2017_q1_03
path: de/2017_q1_03-*
- split: 2017_q2_04
path: de/2017_q2_04-*
- split: 2017_q2_05
path: de/2017_q2_05-*
- split: 2017_q2_06
path: de/2017_q2_06-*
- split: 2017_q3_07
path: de/2017_q3_07-*
- split: 2017_q3_08
path: de/2017_q3_08-*
- split: 2017_q3_09
path: de/2017_q3_09-*
- split: 2017_q4_11
path: de/2017_q4_11-*
- split: 2017_q4_10
path: de/2017_q4_10-*
- split: 2017_q4_12
path: de/2017_q4_12-*
- split: 2018_q1_01
path: de/2018_q1_01-*
- split: 2018_q1_02
path: de/2018_q1_02-*
- split: 2018_q1_03
path: de/2018_q1_03-*
- split: 2018_q2_04
path: de/2018_q2_04-*
- split: 2018_q2_05
path: de/2018_q2_05-*
- split: 2018_q2_06
path: de/2018_q2_06-*
- split: 2018_q3_07
path: de/2018_q3_07-*
- split: 2018_q3_08
path: de/2018_q3_08-*
- split: 2018_q3_09
path: de/2018_q3_09-*
- split: 2018_q4_10
path: de/2018_q4_10-*
- split: 2018_q4_11
path: de/2018_q4_11-*
- split: 2018_q4_12
path: de/2018_q4_12-*
- split: 2019_q1_01
path: de/2019_q1_01-*
- split: 2019_q1_02
path: de/2019_q1_02-*
- split: 2019_q1_03
path: de/2019_q1_03-*
- split: 2019_q2_04
path: de/2019_q2_04-*
- split: 2019_q2_05
path: de/2019_q2_05-*
- split: 2019_q2_06
path: de/2019_q2_06-*
- split: 2019_q3_07
path: de/2019_q3_07-*
- split: 2019_q3_08
path: de/2019_q3_08-*
- split: 2019_q3_09
path: de/2019_q3_09-*
- split: 2019_q4_10
path: de/2019_q4_10-*
- split: 2019_q4_11
path: de/2019_q4_11-*
- split: 2019_q4_12
path: de/2019_q4_12-*
- split: 2020_q1_01
path: de/2020_q1_01-*
- split: 2020_q1_02
path: de/2020_q1_02-*
- split: 2021_q1_01
path: de/2021_q1_01-*
- split: 2020_q1_03
path: de/2020_q1_03-*
- split: 2020_q2_04
path: de/2020_q2_04-*
- split: 2020_q2_05
path: de/2020_q2_05-*
- split: 2020_q2_06
path: de/2020_q2_06-*
- split: 2020_q3_07
path: de/2020_q3_07-*
- split: 2020_q3_08
path: de/2020_q3_08-*
- split: 2020_q4_10
path: de/2020_q4_10-*
- split: 2020_q4_11
path: de/2020_q4_11-*
- split: 2021_q2_06
path: de/2021_q2_06-*
- split: 2021_q1_02
path: de/2021_q1_02-*
- split: 2021_q1_03
path: de/2021_q1_03-*
- split: 2021_q2_04
path: de/2021_q2_04-*
- split: 2021_q2_05
path: de/2021_q2_05-*
- split: 2021_q3_08
path: de/2021_q3_08-*
- split: 2021_q3_09
path: de/2021_q3_09-*
- split: 2021_q4_10
path: de/2021_q4_10-*
- split: 2021_q4_11
path: de/2021_q4_11-*
- split: 2021_q4_12
path: de/2021_q4_12-*
- split: 2022_q1_01
path: de/2022_q1_01-*
- split: 2022_q1_02
path: de/2022_q1_02-*
- split: 2022_q1_03
path: de/2022_q1_03-*
- split: 2022_q2_04
path: de/2022_q2_04-*
- split: 2022_q2_05
path: de/2022_q2_05-*
- split: 2022_q2_06
path: de/2022_q2_06-*
- split: 2022_q3_07
path: de/2022_q3_07-*
- split: 2022_q3_08
path: de/2022_q3_08-*
- split: 2022_q3_09
path: de/2022_q3_09-*
- split: 2022_q4_10
path: de/2022_q4_10-*
- split: 2022_q4_11
path: de/2022_q4_11-*
- split: 2022_q4_12
path: de/2022_q4_12-*
- split: 2023_q1_01
path: de/2023_q1_01-*
- split: 2023_q1_02
path: de/2023_q1_02-*
- split: 2023_q1_03
path: de/2023_q1_03-*
- split: 2023_q2_05
path: de/2023_q2_05-*
- split: 2023_q2_06
path: de/2023_q2_06-*
- split: 2023_q3_07
path: de/2023_q3_07-*
- split: 2023_q3_09
path: de/2023_q3_09-*
- split: 2023_q4_10
path: de/2023_q4_10-*
- split: 2023_q4_11
path: de/2023_q4_11-*
- split: 2023_q4_12
path: de/2023_q4_12-*
- split: 2024_q1_01
path: de/2024_q1_01-*
- split: '2004'
path: de/2004_*
- split: '2005'
path: de/2005_*
- split: '2006'
path: de/2006_*
- split: '2007'
path: de/2007_*
- split: '2008'
path: de/2008_*
- split: '2009'
path: de/2009_*
- split: '2010'
path: de/2010_*
- split: '2011'
path: de/2011_*
- split: '2012'
path: de/2012_*
- split: '2013'
path: de/2013_*
- split: '2014'
path: de/2014_*
- split: '2015'
path: de/2015_*
- split: '2016'
path: de/2016_*
- split: '2017'
path: de/2017_*
- split: '2018'
path: de/2018_*
- split: '2019'
path: de/2019_*
- split: '2020'
path: de/2020_*
- split: '2021'
path: de/2021_*
- split: '2022'
path: de/2022_*
- split: '2023'
path: de/2023_*
- split: '2024'
path: de/2024_*
- split: 2005_q2
path: de/2005_q2_*
- split: 2016_q2
path: de/2016_q2_*
- split: 2017_q1
path: de/2017_q1_*
- split: 2010_q4
path: de/2010_q4_*
- split: 2021_q1
path: de/2021_q1_*
- split: 2014_q4
path: de/2014_q4_*
- split: 2004_q3
path: de/2004_q3_*
- split: 2015_q3
path: de/2015_q3_*
- split: 2019_q3
path: de/2019_q3_*
- split: 2005_q4
path: de/2005_q4_*
- split: 2016_q4
path: de/2016_q4_*
- split: 2017_q3
path: de/2017_q3_*
- split: 2021_q3
path: de/2021_q3_*
- split: 2006_q2
path: de/2006_q2_*
- split: 2024_q1
path: de/2024_q1_*
- split: 2011_q1
path: de/2011_q1_*
- split: 2022_q1
path: de/2022_q1_*
- split: 2008_q2
path: de/2008_q2_*
- split: 2012_q2
path: de/2012_q2_*
- split: 2023_q2
path: de/2023_q2_*
- split: 2013_q1
path: de/2013_q1_*
- split: 2006_q4
path: de/2006_q4_*
- split: 2011_q3
path: de/2011_q3_*
- split: 2022_q3
path: de/2022_q3_*
- split: 2008_q4
path: de/2008_q4_*
- split: 2012_q4
path: de/2012_q4_*
- split: 2014_q1
path: de/2014_q1_*
- split: 2013_q3
path: de/2013_q3_*
- split: 2023_q4
path: de/2023_q4_*
- split: 2007_q1
path: de/2007_q1_*
- split: 2018_q1
path: de/2018_q1_*
- split: 2015_q2
path: de/2015_q2_*
- split: 2019_q2
path: de/2019_q2_*
- split: 2009_q1
path: de/2009_q1_*
- split: 2020_q1
path: de/2020_q1_*
- split: 2017_q2
path: de/2017_q2_*
- split: 2007_q3
path: de/2007_q3_*
- split: 2018_q3
path: de/2018_q3_*
- split: 2021_q2
path: de/2021_q2_*
- split: 2004_q4
path: de/2004_q4_*
- split: 2015_q4
path: de/2015_q4_*
- split: 2019_q4
path: de/2019_q4_*
- split: 2009_q3
path: de/2009_q3_*
- split: 2020_q3
path: de/2020_q3_*
- split: 2021_q4
path: de/2021_q4_*
- split: 2010_q1
path: de/2010_q1_*
- split: 2011_q2
path: de/2011_q2_*
- split: 2022_q2
path: de/2022_q2_*
- split: 2005_q1
path: de/2005_q1_*
- split: 2016_q1
path: de/2016_q1_*
- split: 2010_q3
path: de/2010_q3_*
- split: 2013_q2
path: de/2013_q2_*
- split: 2014_q3
path: de/2014_q3_*
- split: 2011_q4
path: de/2011_q4_*
- split: 2022_q4
path: de/2022_q4_*
- split: 2005_q3
path: de/2005_q3_*
- split: 2016_q3
path: de/2016_q3_*
- split: 2013_q4
path: de/2013_q4_*
- split: 2019_q1
path: de/2019_q1_*
- split: 2006_q1
path: de/2006_q1_*
- split: 2007_q2
path: de/2007_q2_*
- split: 2017_q4
path: de/2017_q4_*
- split: 2008_q1
path: de/2008_q1_*
- split: 2018_q2
path: de/2018_q2_*
- split: 2012_q1
path: de/2012_q1_*
- split: 2023_q1
path: de/2023_q1_*
- split: 2006_q3
path: de/2006_q3_*
- split: 2009_q2
path: de/2009_q2_*
- split: 2020_q2
path: de/2020_q2_*
- split: 2007_q4
path: de/2007_q4_*
- split: 2018_q4
path: de/2018_q4_*
- split: 2008_q3
path: de/2008_q3_*
- split: 2012_q3
path: de/2012_q3_*
- split: 2023_q3
path: de/2023_q3_*
- split: 2009_q4
path: de/2009_q4_*
- split: 2020_q4
path: de/2020_q4_*
- split: 2010_q2
path: de/2010_q2_*
- split: 2014_q2
path: de/2014_q2_*
- split: 2015_q1
path: de/2015_q1_*
- config_name: en
data_files:
- split: 2004_q4_11
path: en/2004_q4_11-*
- split: no_date
path: en/no_date-*
- split: 2004_q4_12
path: en/2004_q4_12-*
- split: 2005_q1_01
path: en/2005_q1_01-*
- split: 2007_q1_01
path: en/2007_q1_01-*
- split: 2005_q1_02
path: en/2005_q1_02-*
- split: 2005_q2_04
path: en/2005_q2_04-*
- split: 2015_q3_08
path: en/2015_q3_08-*
- split: 2005_q1_03
path: en/2005_q1_03-*
- split: 2024_q1_03
path: en/2024_q1_03-*
- split: 2024_q2_04
path: en/2024_q2_04-*
- split: 2005_q2_05
path: en/2005_q2_05-*
- split: 2005_q3_09
path: en/2005_q3_09-*
- split: 2005_q2_06
path: en/2005_q2_06-*
- split: 2005_q3_07
path: en/2005_q3_07-*
- split: 2005_q3_08
path: en/2005_q3_08-*
- split: 2006_q4_12
path: en/2006_q4_12-*
- split: 2005_q4_10
path: en/2005_q4_10-*
- split: 2005_q4_11
path: en/2005_q4_11-*
- split: 2005_q4_12
path: en/2005_q4_12-*
- split: 2006_q1_01
path: en/2006_q1_01-*
- split: 2006_q1_03
path: en/2006_q1_03-*
- split: 2006_q1_02
path: en/2006_q1_02-*
- split: 2009_q1_03
path: en/2009_q1_03-*
- split: 2006_q2_04
path: en/2006_q2_04-*
- split: 2006_q2_05
path: en/2006_q2_05-*
- split: 2006_q2_06
path: en/2006_q2_06-*
- split: 2006_q3_07
path: en/2006_q3_07-*
- split: 2006_q3_08
path: en/2006_q3_08-*
- split: 2006_q4_10
path: en/2006_q4_10-*
- split: 2006_q3_09
path: en/2006_q3_09-*
- split: 2006_q4_11
path: en/2006_q4_11-*
- split: 2007_q1_02
path: en/2007_q1_02-*
- split: 2007_q1_03
path: en/2007_q1_03-*
- split: 2007_q2_05
path: en/2007_q2_05-*
- split: 2007_q2_04
path: en/2007_q2_04-*
- split: 2007_q3_08
path: en/2007_q3_08-*
- split: 2011_q4_10
path: en/2011_q4_10-*
- split: 2008_q2_06
path: en/2008_q2_06-*
- split: 2009_q4_11
path: en/2009_q4_11-*
- split: 2007_q4_10
path: en/2007_q4_10-*
- split: 2007_q2_06
path: en/2007_q2_06-*
- split: 2007_q4_11
path: en/2007_q4_11-*
- split: 2007_q3_07
path: en/2007_q3_07-*
- split: 2007_q3_09
path: en/2007_q3_09-*
- split: 2008_q1_01
path: en/2008_q1_01-*
- split: 2007_q4_12
path: en/2007_q4_12-*
- split: 2009_q1_01
path: en/2009_q1_01-*
- split: 2008_q1_02
path: en/2008_q1_02-*
- split: 2008_q1_03
path: en/2008_q1_03-*
- split: 2008_q2_04
path: en/2008_q2_04-*
- split: 2008_q3_08
path: en/2008_q3_08-*
- split: 2008_q2_05
path: en/2008_q2_05-*
- split: 2009_q3_08
path: en/2009_q3_08-*
- split: 2008_q3_07
path: en/2008_q3_07-*
- split: 2008_q3_09
path: en/2008_q3_09-*
- split: 2009_q3_09
path: en/2009_q3_09-*
- split: 2009_q3_07
path: en/2009_q3_07-*
- split: 2008_q4_10
path: en/2008_q4_10-*
- split: 2008_q4_11
path: en/2008_q4_11-*
- split: 2008_q4_12
path: en/2008_q4_12-*
- split: 2009_q1_02
path: en/2009_q1_02-*
- split: 2009_q2_05
path: en/2009_q2_05-*
- split: 2009_q2_04
path: en/2009_q2_04-*
- split: 2009_q2_06
path: en/2009_q2_06-*
- split: 2009_q4_10
path: en/2009_q4_10-*
- split: 2009_q4_12
path: en/2009_q4_12-*
- split: 2010_q1_01
path: en/2010_q1_01-*
- split: 2010_q2_04
path: en/2010_q2_04-*
- split: 2010_q1_02
path: en/2010_q1_02-*
- split: 2010_q1_03
path: en/2010_q1_03-*
- split: 2010_q4_12
path: en/2010_q4_12-*
- split: 2010_q2_05
path: en/2010_q2_05-*
- split: 2010_q3_09
path: en/2010_q3_09-*
- split: 2010_q2_06
path: en/2010_q2_06-*
- split: 2010_q3_07
path: en/2010_q3_07-*
- split: 2010_q3_08
path: en/2010_q3_08-*
- split: 2010_q4_10
path: en/2010_q4_10-*
- split: 2010_q4_11
path: en/2010_q4_11-*
- split: 2011_q4_12
path: en/2011_q4_12-*
- split: 2011_q1_01
path: en/2011_q1_01-*
- split: 2011_q1_02
path: en/2011_q1_02-*
- split: 2011_q1_03
path: en/2011_q1_03-*
- split: 2011_q2_04
path: en/2011_q2_04-*
- split: 2011_q2_05
path: en/2011_q2_05-*
- split: 2011_q2_06
path: en/2011_q2_06-*
- split: 2011_q3_07
path: en/2011_q3_07-*
- split: 2011_q3_08
path: en/2011_q3_08-*
- split: 2011_q3_09
path: en/2011_q3_09-*
- split: 2011_q4_11
path: en/2011_q4_11-*
- split: 2012_q1_01
path: en/2012_q1_01-*
- split: 2012_q2_05
path: en/2012_q2_05-*
- split: 2012_q1_02
path: en/2012_q1_02-*
- split: 2012_q1_03
path: en/2012_q1_03-*
- split: 2012_q2_04
path: en/2012_q2_04-*
- split: 2012_q2_06
path: en/2012_q2_06-*
- split: 2012_q3_07
path: en/2012_q3_07-*
- split: 2012_q3_08
path: en/2012_q3_08-*
- split: 2012_q3_09
path: en/2012_q3_09-*
- split: 2012_q4_10
path: en/2012_q4_10-*
- split: 2012_q4_11
path: en/2012_q4_11-*
- split: 2012_q4_12
path: en/2012_q4_12-*
- split: 2013_q1_02
path: en/2013_q1_02-*
- split: 2013_q1_01
path: en/2013_q1_01-*
- split: 2013_q1_03
path: en/2013_q1_03-*
- split: 2013_q2_04
path: en/2013_q2_04-*
- split: 2013_q2_05
path: en/2013_q2_05-*
- split: 2013_q2_06
path: en/2013_q2_06-*
- split: 2013_q3_07
path: en/2013_q3_07-*
- split: 2013_q3_08
path: en/2013_q3_08-*
- split: 2013_q3_09
path: en/2013_q3_09-*
- split: 2013_q4_10
path: en/2013_q4_10-*
- split: 2013_q4_11
path: en/2013_q4_11-*
- split: 2013_q4_12
path: en/2013_q4_12-*
- split: 2014_q1_01
path: en/2014_q1_01-*
- split: 2014_q1_02
path: en/2014_q1_02-*
- split: 2014_q1_03
path: en/2014_q1_03-*
- split: 2014_q2_04
path: en/2014_q2_04-*
- split: 2014_q2_05
path: en/2014_q2_05-*
- split: 2014_q2_06
path: en/2014_q2_06-*
- split: 2014_q3_07
path: en/2014_q3_07-*
- split: 2014_q3_08
path: en/2014_q3_08-*
- split: 2014_q3_09
path: en/2014_q3_09-*
- split: 2014_q4_11
path: en/2014_q4_11-*
- split: 2014_q4_10
path: en/2014_q4_10-*
- split: 2014_q4_12
path: en/2014_q4_12-*
- split: 2015_q1_01
path: en/2015_q1_01-*
- split: 2015_q1_02
path: en/2015_q1_02-*
- split: 2015_q1_03
path: en/2015_q1_03-*
- split: 2015_q2_04
path: en/2015_q2_04-*
- split: 2015_q2_05
path: en/2015_q2_05-*
- split: 2015_q2_06
path: en/2015_q2_06-*
- split: 2015_q3_07
path: en/2015_q3_07-*
- split: 2015_q3_09
path: en/2015_q3_09-*
- split: 2015_q4_10
path: en/2015_q4_10-*
- split: 2015_q4_11
path: en/2015_q4_11-*
- split: 2015_q4_12
path: en/2015_q4_12-*
- split: 2016_q1_01
path: en/2016_q1_01-*
- split: 2016_q1_02
path: en/2016_q1_02-*
- split: 2016_q1_03
path: en/2016_q1_03-*
- split: 2016_q2_04
path: en/2016_q2_04-*
- split: 2016_q2_05
path: en/2016_q2_05-*
- split: 2016_q2_06
path: en/2016_q2_06-*
- split: 2016_q3_07
path: en/2016_q3_07-*
- split: 2016_q3_08
path: en/2016_q3_08-*
- split: 2016_q3_09
path: en/2016_q3_09-*
- split: 2016_q4_10
path: en/2016_q4_10-*
- split: 2016_q4_11
path: en/2016_q4_11-*
- split: 2016_q4_12
path: en/2016_q4_12-*
- split: 2017_q1_01
path: en/2017_q1_01-*
- split: 2017_q1_02
path: en/2017_q1_02-*
- split: 2017_q1_03
path: en/2017_q1_03-*
- split: 2017_q2_04
path: en/2017_q2_04-*
- split: 2017_q2_05
path: en/2017_q2_05-*
- split: 2018_q1_01
path: en/2018_q1_01-*
- split: 2017_q2_06
path: en/2017_q2_06-*
- split: 2017_q3_07
path: en/2017_q3_07-*
- split: 2017_q3_08
path: en/2017_q3_08-*
- split: 2017_q4_10
path: en/2017_q4_10-*
- split: 2017_q3_09
path: en/2017_q3_09-*
- split: 2017_q4_11
path: en/2017_q4_11-*
- split: 2017_q4_12
path: en/2017_q4_12-*
- split: 2018_q1_02
path: en/2018_q1_02-*
- split: 2018_q1_03
path: en/2018_q1_03-*
- split: 2018_q2_04
path: en/2018_q2_04-*
- split: 2018_q2_05
path: en/2018_q2_05-*
- split: 2018_q2_06
path: en/2018_q2_06-*
- split: 2018_q3_07
path: en/2018_q3_07-*
- split: 2018_q3_08
path: en/2018_q3_08-*
- split: 2018_q3_09
path: en/2018_q3_09-*
- split: 2018_q4_10
path: en/2018_q4_10-*
- split: 2019_q1_01
path: en/2019_q1_01-*
- split: 2018_q4_11
path: en/2018_q4_11-*
- split: 2018_q4_12
path: en/2018_q4_12-*
- split: 2019_q1_02
path: en/2019_q1_02-*
- split: 2019_q1_03
path: en/2019_q1_03-*
- split: 2019_q2_04
path: en/2019_q2_04-*
- split: 2019_q2_05
path: en/2019_q2_05-*
- split: 2019_q2_06
path: en/2019_q2_06-*
- split: 2019_q3_07
path: en/2019_q3_07-*
- split: 2019_q3_08
path: en/2019_q3_08-*
- split: 2019_q3_09
path: en/2019_q3_09-*
- split: 2019_q4_10
path: en/2019_q4_10-*
- split: 2019_q4_11
path: en/2019_q4_11-*
- split: 2019_q4_12
path: en/2019_q4_12-*
- split: 2020_q1_01
path: en/2020_q1_01-*
- split: 2020_q1_02
path: en/2020_q1_02-*
- split: 2020_q1_03
path: en/2020_q1_03-*
- split: 2020_q2_04
path: en/2020_q2_04-*
- split: 2020_q3_08
path: en/2020_q3_08-*
- split: 2020_q2_05
path: en/2020_q2_05-*
- split: 2020_q2_06
path: en/2020_q2_06-*
- split: 2020_q3_07
path: en/2020_q3_07-*
- split: 2020_q3_09
path: en/2020_q3_09-*
- split: 2020_q4_10
path: en/2020_q4_10-*
- split: 2020_q4_12
path: en/2020_q4_12-*
- split: 2020_q4_11
path: en/2020_q4_11-*
- split: 2021_q2_04
path: en/2021_q2_04-*
- split: 2021_q1_01
path: en/2021_q1_01-*
- split: 2021_q1_02
path: en/2021_q1_02-*
- split: 2021_q1_03
path: en/2021_q1_03-*
- split: 2021_q2_05
path: en/2021_q2_05-*
- split: 2021_q2_06
path: en/2021_q2_06-*
- split: 2021_q3_07
path: en/2021_q3_07-*
- split: 2021_q3_08
path: en/2021_q3_08-*
- split: 2021_q3_09
path: en/2021_q3_09-*
- split: 2021_q4_10
path: en/2021_q4_10-*
- split: 2021_q4_11
path: en/2021_q4_11-*
- split: 2022_q1_02
path: en/2022_q1_02-*
- split: 2021_q4_12
path: en/2021_q4_12-*
- split: 2022_q1_01
path: en/2022_q1_01-*
- split: 2022_q1_03
path: en/2022_q1_03-*
- split: 2022_q2_04
path: en/2022_q2_04-*
- split: 2022_q2_05
path: en/2022_q2_05-*
- split: 2022_q2_06
path: en/2022_q2_06-*
- split: 2022_q3_07
path: en/2022_q3_07-*
- split: 2022_q3_08
path: en/2022_q3_08-*
- split: 2022_q3_09
path: en/2022_q3_09-*
- split: 2022_q4_11
path: en/2022_q4_11-*
- split: 2022_q4_10
path: en/2022_q4_10-*
- split: 2022_q4_12
path: en/2022_q4_12-*
- split: 2023_q1_01
path: en/2023_q1_01-*
- split: 2023_q1_02
path: en/2023_q1_02-*
- split: 2023_q1_03
path: en/2023_q1_03-*
- split: 2023_q2_04
path: en/2023_q2_04-*
- split: 2023_q2_05
path: en/2023_q2_05-*
- split: 2023_q2_06
path: en/2023_q2_06-*
- split: 2023_q3_07
path: en/2023_q3_07-*
- split: 2023_q3_08
path: en/2023_q3_08-*
- split: 2023_q3_09
path: en/2023_q3_09-*
- split: 2023_q4_10
path: en/2023_q4_10-*
- split: 2023_q4_12
path: en/2023_q4_12-*
- split: 2023_q4_11
path: en/2023_q4_11-*
- split: 2024_q1_01
path: en/2024_q1_01-*
- split: '2004'
path: en/2004_*
- split: '2005'
path: en/2005_*
- split: '2006'
path: en/2006_*
- split: '2007'
path: en/2007_*
- split: '2008'
path: en/2008_*
- split: '2009'
path: en/2009_*
- split: '2010'
path: en/2010_*
- split: '2011'
path: en/2011_*
- split: '2012'
path: en/2012_*
- split: '2013'
path: en/2013_*
- split: '2014'
path: en/2014_*
- split: '2015'
path: en/2015_*
- split: '2016'
path: en/2016_*
- split: '2017'
path: en/2017_*
- split: '2018'
path: en/2018_*
- split: '2019'
path: en/2019_*
- split: '2020'
path: en/2020_*
- split: '2021'
path: en/2021_*
- split: '2022'
path: en/2022_*
- split: '2023'
path: en/2023_*
- split: '2024'
path: en/2024_*
- split: 2005_q2
path: en/2005_q2_*
- split: 2016_q2
path: en/2016_q2_*
- split: 2017_q1
path: en/2017_q1_*
- split: 2010_q4
path: en/2010_q4_*
- split: 2021_q1
path: en/2021_q1_*
- split: 2014_q4
path: en/2014_q4_*
- split: 2015_q3
path: en/2015_q3_*
- split: 2019_q3
path: en/2019_q3_*
- split: 2005_q4
path: en/2005_q4_*
- split: 2016_q4
path: en/2016_q4_*
- split: 2017_q3
path: en/2017_q3_*
- split: 2021_q3
path: en/2021_q3_*
- split: 2006_q2
path: en/2006_q2_*
- split: 2024_q1
path: en/2024_q1_*
- split: 2011_q1
path: en/2011_q1_*
- split: 2022_q1
path: en/2022_q1_*
- split: 2008_q2
path: en/2008_q2_*
- split: 2012_q2
path: en/2012_q2_*
- split: 2023_q2
path: en/2023_q2_*
- split: 2013_q1
path: en/2013_q1_*
- split: 2006_q4
path: en/2006_q4_*
- split: 2011_q3
path: en/2011_q3_*
- split: 2022_q3
path: en/2022_q3_*
- split: 2008_q4
path: en/2008_q4_*
- split: 2012_q4
path: en/2012_q4_*
- split: 2014_q1
path: en/2014_q1_*
- split: 2013_q3
path: en/2013_q3_*
- split: 2023_q4
path: en/2023_q4_*
- split: 2007_q1
path: en/2007_q1_*
- split: 2018_q1
path: en/2018_q1_*
- split: 2015_q2
path: en/2015_q2_*
- split: 2019_q2
path: en/2019_q2_*
- split: 2009_q1
path: en/2009_q1_*
- split: 2020_q1
path: en/2020_q1_*
- split: 2017_q2
path: en/2017_q2_*
- split: 2007_q3
path: en/2007_q3_*
- split: 2018_q3
path: en/2018_q3_*
- split: 2021_q2
path: en/2021_q2_*
- split: 2004_q4
path: en/2004_q4_*
- split: 2015_q4
path: en/2015_q4_*
- split: 2019_q4
path: en/2019_q4_*
- split: 2009_q3
path: en/2009_q3_*
- split: 2020_q3
path: en/2020_q3_*
- split: 2021_q4
path: en/2021_q4_*
- split: 2010_q1
path: en/2010_q1_*
- split: 2011_q2
path: en/2011_q2_*
- split: 2022_q2
path: en/2022_q2_*
- split: 2005_q1
path: en/2005_q1_*
- split: 2016_q1
path: en/2016_q1_*
- split: 2010_q3
path: en/2010_q3_*
- split: 2013_q2
path: en/2013_q2_*
- split: 2014_q3
path: en/2014_q3_*
- split: 2011_q4
path: en/2011_q4_*
- split: 2022_q4
path: en/2022_q4_*
- split: 2005_q3
path: en/2005_q3_*
- split: 2016_q3
path: en/2016_q3_*
- split: 2013_q4
path: en/2013_q4_*
- split: 2019_q1
path: en/2019_q1_*
- split: 2006_q1
path: en/2006_q1_*
- split: 2007_q2
path: en/2007_q2_*
- split: 2017_q4
path: en/2017_q4_*
- split: 2008_q1
path: en/2008_q1_*
- split: 2018_q2
path: en/2018_q2_*
- split: 2012_q1
path: en/2012_q1_*
- split: 2023_q1
path: en/2023_q1_*
- split: 2006_q3
path: en/2006_q3_*
- split: 2009_q2
path: en/2009_q2_*
- split: 2020_q2
path: en/2020_q2_*
- split: 2024_q2
path: en/2024_q2_*
- split: 2007_q4
path: en/2007_q4_*
- split: 2018_q4
path: en/2018_q4_*
- split: 2008_q3
path: en/2008_q3_*
- split: 2012_q3
path: en/2012_q3_*
- split: 2023_q3
path: en/2023_q3_*
- split: 2009_q4
path: en/2009_q4_*
- split: 2020_q4
path: en/2020_q4_*
- split: 2010_q2
path: en/2010_q2_*
- split: 2014_q2
path: en/2014_q2_*
- split: 2015_q1
path: en/2015_q1_*
- config_name: es
data_files:
- split: 2005_q1_01
path: es/2005_q1_01-*
- split: 2005_q1_02
path: es/2005_q1_02-*
- split: 2004_q1_02
path: es/2004_q1_02-*
- split: 2005_q1_03
path: es/2005_q1_03-*
- split: no_date
path: es/no_date-*
- split: 2005_q2_04
path: es/2005_q2_04-*
- split: 2005_q2_05
path: es/2005_q2_05-*
- split: 2005_q2_06
path: es/2005_q2_06-*
- split: 2005_q3_07
path: es/2005_q3_07-*
- split: 2005_q3_08
path: es/2005_q3_08-*
- split: 2005_q3_09
path: es/2005_q3_09-*
- split: 2005_q4_10
path: es/2005_q4_10-*
- split: 2005_q4_12
path: es/2005_q4_12-*
- split: 2006_q4_10
path: es/2006_q4_10-*
- split: 2005_q4_11
path: es/2005_q4_11-*
- split: 2006_q1_01
path: es/2006_q1_01-*
- split: 2006_q1_02
path: es/2006_q1_02-*
- split: 2006_q1_03
path: es/2006_q1_03-*
- split: 2006_q2_04
path: es/2006_q2_04-*
- split: 2006_q2_05
path: es/2006_q2_05-*
- split: 2006_q2_06
path: es/2006_q2_06-*
- split: 2006_q3_07
path: es/2006_q3_07-*
- split: 2006_q3_08
path: es/2006_q3_08-*
- split: 2006_q3_09
path: es/2006_q3_09-*
- split: 2006_q4_11
path: es/2006_q4_11-*
- split: 2006_q4_12
path: es/2006_q4_12-*
- split: 2007_q1_01
path: es/2007_q1_01-*
- split: 2007_q1_02
path: es/2007_q1_02-*
- split: 2007_q1_03
path: es/2007_q1_03-*
- split: 2007_q2_04
path: es/2007_q2_04-*
- split: 2007_q2_05
path: es/2007_q2_05-*
- split: 2007_q2_06
path: es/2007_q2_06-*
- split: 2007_q3_07
path: es/2007_q3_07-*
- split: 2007_q3_08
path: es/2007_q3_08-*
- split: 2007_q3_09
path: es/2007_q3_09-*
- split: 2007_q4_10
path: es/2007_q4_10-*
- split: 2007_q4_11
path: es/2007_q4_11-*
- split: 2007_q4_12
path: es/2007_q4_12-*
- split: 2008_q1_01
path: es/2008_q1_01-*
- split: 2008_q1_02
path: es/2008_q1_02-*
- split: 2008_q1_03
path: es/2008_q1_03-*
- split: 2008_q2_04
path: es/2008_q2_04-*
- split: 2008_q2_05
path: es/2008_q2_05-*
- split: 2008_q2_06
path: es/2008_q2_06-*
- split: 2008_q3_07
path: es/2008_q3_07-*
- split: 2008_q3_08
path: es/2008_q3_08-*
- split: 2008_q3_09
path: es/2008_q3_09-*
- split: 2008_q4_10
path: es/2008_q4_10-*
- split: 2008_q4_11
path: es/2008_q4_11-*
- split: 2008_q4_12
path: es/2008_q4_12-*
- split: 2009_q1_01
path: es/2009_q1_01-*
- split: 2009_q1_02
path: es/2009_q1_02-*
- split: 2009_q1_03
path: es/2009_q1_03-*
- split: 2009_q2_04
path: es/2009_q2_04-*
- split: 2009_q2_05
path: es/2009_q2_05-*
- split: 2009_q2_06
path: es/2009_q2_06-*
- split: 2009_q3_07
path: es/2009_q3_07-*
- split: 2009_q3_08
path: es/2009_q3_08-*
- split: 2009_q3_09
path: es/2009_q3_09-*
- split: 2009_q4_10
path: es/2009_q4_10-*
- split: 2009_q4_11
path: es/2009_q4_11-*
- split: 2009_q4_12
path: es/2009_q4_12-*
- split: 2010_q1_01
path: es/2010_q1_01-*
- split: 2010_q1_02
path: es/2010_q1_02-*
- split: 2010_q1_03
path: es/2010_q1_03-*
- split: 2011_q1_02
path: es/2011_q1_02-*
- split: 2010_q2_04
path: es/2010_q2_04-*
- split: 2010_q2_05
path: es/2010_q2_05-*
- split: 2010_q2_06
path: es/2010_q2_06-*
- split: 2010_q3_07
path: es/2010_q3_07-*
- split: 2010_q3_08
path: es/2010_q3_08-*
- split: 2010_q3_09
path: es/2010_q3_09-*
- split: 2010_q4_10
path: es/2010_q4_10-*
- split: 2010_q4_11
path: es/2010_q4_11-*
- split: 2010_q4_12
path: es/2010_q4_12-*
- split: 2011_q1_01
path: es/2011_q1_01-*
- split: 2013_q2_04
path: es/2013_q2_04-*
- split: 2011_q1_03
path: es/2011_q1_03-*
- split: 2011_q2_04
path: es/2011_q2_04-*
- split: 2011_q2_05
path: es/2011_q2_05-*
- split: 2011_q2_06
path: es/2011_q2_06-*
- split: 2011_q3_07
path: es/2011_q3_07-*
- split: 2011_q3_08
path: es/2011_q3_08-*
- split: 2011_q3_09
path: es/2011_q3_09-*
- split: 2011_q4_10
path: es/2011_q4_10-*
- split: 2011_q4_11
path: es/2011_q4_11-*
- split: 2011_q4_12
path: es/2011_q4_12-*
- split: 2012_q1_01
path: es/2012_q1_01-*
- split: 2012_q1_02
path: es/2012_q1_02-*
- split: 2012_q1_03
path: es/2012_q1_03-*
- split: 2012_q2_04
path: es/2012_q2_04-*
- split: 2012_q2_05
path: es/2012_q2_05-*
- split: 2012_q2_06
path: es/2012_q2_06-*
- split: 2012_q3_07
path: es/2012_q3_07-*
- split: 2012_q3_08
path: es/2012_q3_08-*
- split: 2012_q3_09
path: es/2012_q3_09-*
- split: 2012_q4_10
path: es/2012_q4_10-*
- split: 2012_q4_11
path: es/2012_q4_11-*
- split: 2012_q4_12
path: es/2012_q4_12-*
- split: 2013_q1_01
path: es/2013_q1_01-*
- split: 2013_q1_02
path: es/2013_q1_02-*
- split: 2013_q1_03
path: es/2013_q1_03-*
- split: 2013_q2_05
path: es/2013_q2_05-*
- split: 2013_q2_06
path: es/2013_q2_06-*
- split: 2013_q3_07
path: es/2013_q3_07-*
- split: 2013_q3_08
path: es/2013_q3_08-*
- split: 2013_q3_09
path: es/2013_q3_09-*
- split: 2013_q4_11
path: es/2013_q4_11-*
- split: 2013_q4_10
path: es/2013_q4_10-*
- split: 2013_q4_12
path: es/2013_q4_12-*
- split: 2014_q1_01
path: es/2014_q1_01-*
- split: 2014_q1_02
path: es/2014_q1_02-*
- split: 2024_q1_02
path: es/2024_q1_02-*
- split: 2014_q1_03
path: es/2014_q1_03-*
- split: 2014_q2_04
path: es/2014_q2_04-*
- split: 2014_q2_05
path: es/2014_q2_05-*
- split: 2014_q2_06
path: es/2014_q2_06-*
- split: 2014_q3_07
path: es/2014_q3_07-*
- split: 2014_q3_08
path: es/2014_q3_08-*
- split: 2014_q3_09
path: es/2014_q3_09-*
- split: 2014_q4_10
path: es/2014_q4_10-*
- split: 2014_q4_11
path: es/2014_q4_11-*
- split: 2014_q4_12
path: es/2014_q4_12-*
- split: 2015_q1_01
path: es/2015_q1_01-*
- split: 2015_q1_02
path: es/2015_q1_02-*
- split: 2015_q1_03
path: es/2015_q1_03-*
- split: 2015_q2_04
path: es/2015_q2_04-*
- split: 2015_q2_05
path: es/2015_q2_05-*
- split: 2015_q2_06
path: es/2015_q2_06-*
- split: 2015_q3_07
path: es/2015_q3_07-*
- split: 2015_q3_08
path: es/2015_q3_08-*
- split: 2015_q3_09
path: es/2015_q3_09-*
- split: 2015_q4_10
path: es/2015_q4_10-*
- split: 2015_q4_11
path: es/2015_q4_11-*
- split: 2015_q4_12
path: es/2015_q4_12-*
- split: 2016_q1_01
path: es/2016_q1_01-*
- split: 2016_q1_02
path: es/2016_q1_02-*
- split: 2016_q1_03
path: es/2016_q1_03-*
- split: 2016_q2_04
path: es/2016_q2_04-*
- split: 2016_q2_05
path: es/2016_q2_05-*
- split: 2016_q2_06
path: es/2016_q2_06-*
- split: 2016_q3_07
path: es/2016_q3_07-*
- split: 2016_q3_08
path: es/2016_q3_08-*
- split: 2016_q3_09
path: es/2016_q3_09-*
- split: 2016_q4_10
path: es/2016_q4_10-*
- split: 2016_q4_11
path: es/2016_q4_11-*
- split: 2016_q4_12
path: es/2016_q4_12-*
- split: 2017_q1_01
path: es/2017_q1_01-*
- split: 2017_q1_02
path: es/2017_q1_02-*
- split: 2017_q1_03
path: es/2017_q1_03-*
- split: 2017_q2_04
path: es/2017_q2_04-*
- split: 2017_q2_05
path: es/2017_q2_05-*
- split: 2017_q2_06
path: es/2017_q2_06-*
- split: 2017_q3_07
path: es/2017_q3_07-*
- split: 2017_q3_08
path: es/2017_q3_08-*
- split: 2017_q3_09
path: es/2017_q3_09-*
- split: 2017_q4_10
path: es/2017_q4_10-*
- split: 2017_q4_11
path: es/2017_q4_11-*
- split: 2017_q4_12
path: es/2017_q4_12-*
- split: 2018_q1_01
path: es/2018_q1_01-*
- split: 2018_q1_02
path: es/2018_q1_02-*
- split: 2018_q1_03
path: es/2018_q1_03-*
- split: 2018_q2_04
path: es/2018_q2_04-*
- split: 2018_q2_05
path: es/2018_q2_05-*
- split: 2018_q2_06
path: es/2018_q2_06-*
- split: 2018_q3_07
path: es/2018_q3_07-*
- split: 2018_q3_08
path: es/2018_q3_08-*
- split: 2018_q3_09
path: es/2018_q3_09-*
- split: 2018_q4_10
path: es/2018_q4_10-*
- split: 2018_q4_11
path: es/2018_q4_11-*
- split: 2018_q4_12
path: es/2018_q4_12-*
- split: 2019_q1_01
path: es/2019_q1_01-*
- split: 2019_q1_02
path: es/2019_q1_02-*
- split: 2019_q1_03
path: es/2019_q1_03-*
- split: 2019_q2_04
path: es/2019_q2_04-*
- split: 2019_q2_05
path: es/2019_q2_05-*
- split: 2019_q2_06
path: es/2019_q2_06-*
- split: 2019_q3_07
path: es/2019_q3_07-*
- split: 2019_q3_08
path: es/2019_q3_08-*
- split: 2019_q3_09
path: es/2019_q3_09-*
- split: 2019_q4_10
path: es/2019_q4_10-*
- split: 2019_q4_11
path: es/2019_q4_11-*
- split: 2019_q4_12
path: es/2019_q4_12-*
- split: 2020_q1_01
path: es/2020_q1_01-*
- split: 2020_q1_02
path: es/2020_q1_02-*
- split: 2020_q1_03
path: es/2020_q1_03-*
- split: 2020_q2_04
path: es/2020_q2_04-*
- split: 2020_q2_05
path: es/2020_q2_05-*
- split: 2020_q2_06
path: es/2020_q2_06-*
- split: 2020_q3_07
path: es/2020_q3_07-*
- split: 2020_q3_08
path: es/2020_q3_08-*
- split: 2020_q3_09
path: es/2020_q3_09-*
- split: 2020_q4_10
path: es/2020_q4_10-*
- split: 2020_q4_11
path: es/2020_q4_11-*
- split: 2020_q4_12
path: es/2020_q4_12-*
- split: 2021_q1_01
path: es/2021_q1_01-*
- split: 2021_q1_02
path: es/2021_q1_02-*
- split: 2021_q1_03
path: es/2021_q1_03-*
- split: 2021_q2_04
path: es/2021_q2_04-*
- split: 2021_q2_05
path: es/2021_q2_05-*
- split: 2021_q2_06
path: es/2021_q2_06-*
- split: 2021_q3_07
path: es/2021_q3_07-*
- split: 2021_q3_08
path: es/2021_q3_08-*
- split: 2021_q3_09
path: es/2021_q3_09-*
- split: 2021_q4_10
path: es/2021_q4_10-*
- split: 2021_q4_11
path: es/2021_q4_11-*
- split: 2021_q4_12
path: es/2021_q4_12-*
- split: 2022_q1_01
path: es/2022_q1_01-*
- split: 2022_q1_02
path: es/2022_q1_02-*
- split: 2022_q1_03
path: es/2022_q1_03-*
- split: 2022_q2_04
path: es/2022_q2_04-*
- split: 2022_q2_05
path: es/2022_q2_05-*
- split: 2022_q2_06
path: es/2022_q2_06-*
- split: 2022_q3_07
path: es/2022_q3_07-*
- split: 2022_q3_08
path: es/2022_q3_08-*
- split: 2022_q3_09
path: es/2022_q3_09-*
- split: 2022_q4_10
path: es/2022_q4_10-*
- split: 2022_q4_11
path: es/2022_q4_11-*
- split: 2022_q4_12
path: es/2022_q4_12-*
- split: 2023_q1_01
path: es/2023_q1_01-*
- split: 2023_q1_02
path: es/2023_q1_02-*
- split: 2023_q1_03
path: es/2023_q1_03-*
- split: 2023_q2_04
path: es/2023_q2_04-*
- split: 2023_q2_05
path: es/2023_q2_05-*
- split: 2023_q2_06
path: es/2023_q2_06-*
- split: 2023_q3_07
path: es/2023_q3_07-*
- split: 2023_q3_08
path: es/2023_q3_08-*
- split: 2023_q3_09
path: es/2023_q3_09-*
- split: 2023_q4_10
path: es/2023_q4_10-*
- split: 2023_q4_11
path: es/2023_q4_11-*
- split: 2023_q4_12
path: es/2023_q4_12-*
- split: 2024_q1_01
path: es/2024_q1_01-*
- split: '2004'
path: es/2004_*
- split: '2005'
path: es/2005_*
- split: '2006'
path: es/2006_*
- split: '2007'
path: es/2007_*
- split: '2008'
path: es/2008_*
- split: '2009'
path: es/2009_*
- split: '2010'
path: es/2010_*
- split: '2011'
path: es/2011_*
- split: '2012'
path: es/2012_*
- split: '2013'
path: es/2013_*
- split: '2014'
path: es/2014_*
- split: '2015'
path: es/2015_*
- split: '2016'
path: es/2016_*
- split: '2017'
path: es/2017_*
- split: '2018'
path: es/2018_*
- split: '2019'
path: es/2019_*
- split: '2020'
path: es/2020_*
- split: '2021'
path: es/2021_*
- split: '2022'
path: es/2022_*
- split: '2023'
path: es/2023_*
- split: '2024'
path: es/2024_*
- split: 2005_q2
path: es/2005_q2_*
- split: 2016_q2
path: es/2016_q2_*
- split: 2017_q1
path: es/2017_q1_*
- split: 2010_q4
path: es/2010_q4_*
- split: 2021_q1
path: es/2021_q1_*
- split: 2014_q4
path: es/2014_q4_*
- split: 2015_q3
path: es/2015_q3_*
- split: 2019_q3
path: es/2019_q3_*
- split: 2005_q4
path: es/2005_q4_*
- split: 2016_q4
path: es/2016_q4_*
- split: 2017_q3
path: es/2017_q3_*
- split: 2021_q3
path: es/2021_q3_*
- split: 2006_q2
path: es/2006_q2_*
- split: 2024_q1
path: es/2024_q1_*
- split: 2011_q1
path: es/2011_q1_*
- split: 2022_q1
path: es/2022_q1_*
- split: 2008_q2
path: es/2008_q2_*
- split: 2012_q2
path: es/2012_q2_*
- split: 2023_q2
path: es/2023_q2_*
- split: 2013_q1
path: es/2013_q1_*
- split: 2006_q4
path: es/2006_q4_*
- split: 2011_q3
path: es/2011_q3_*
- split: 2022_q3
path: es/2022_q3_*
- split: 2008_q4
path: es/2008_q4_*
- split: 2012_q4
path: es/2012_q4_*
- split: 2014_q1
path: es/2014_q1_*
- split: 2013_q3
path: es/2013_q3_*
- split: 2023_q4
path: es/2023_q4_*
- split: 2007_q1
path: es/2007_q1_*
- split: 2018_q1
path: es/2018_q1_*
- split: 2015_q2
path: es/2015_q2_*
- split: 2019_q2
path: es/2019_q2_*
- split: 2009_q1
path: es/2009_q1_*
- split: 2020_q1
path: es/2020_q1_*
- split: 2017_q2
path: es/2017_q2_*
- split: 2007_q3
path: es/2007_q3_*
- split: 2018_q3
path: es/2018_q3_*
- split: 2021_q2
path: es/2021_q2_*
- split: 2015_q4
path: es/2015_q4_*
- split: 2019_q4
path: es/2019_q4_*
- split: 2009_q3
path: es/2009_q3_*
- split: 2020_q3
path: es/2020_q3_*
- split: 2021_q4
path: es/2021_q4_*
- split: 2010_q1
path: es/2010_q1_*
- split: 2011_q2
path: es/2011_q2_*
- split: 2022_q2
path: es/2022_q2_*
- split: 2005_q1
path: es/2005_q1_*
- split: 2016_q1
path: es/2016_q1_*
- split: 2010_q3
path: es/2010_q3_*
- split: 2013_q2
path: es/2013_q2_*
- split: 2014_q3
path: es/2014_q3_*
- split: 2011_q4
path: es/2011_q4_*
- split: 2022_q4
path: es/2022_q4_*
- split: 2005_q3
path: es/2005_q3_*
- split: 2016_q3
path: es/2016_q3_*
- split: 2013_q4
path: es/2013_q4_*
- split: 2019_q1
path: es/2019_q1_*
- split: 2006_q1
path: es/2006_q1_*
- split: 2004_q1
path: es/2004_q1_*
- split: 2007_q2
path: es/2007_q2_*
- split: 2017_q4
path: es/2017_q4_*
- split: 2008_q1
path: es/2008_q1_*
- split: 2018_q2
path: es/2018_q2_*
- split: 2012_q1
path: es/2012_q1_*
- split: 2023_q1
path: es/2023_q1_*
- split: 2006_q3
path: es/2006_q3_*
- split: 2009_q2
path: es/2009_q2_*
- split: 2020_q2
path: es/2020_q2_*
- split: 2007_q4
path: es/2007_q4_*
- split: 2018_q4
path: es/2018_q4_*
- split: 2008_q3
path: es/2008_q3_*
- split: 2012_q3
path: es/2012_q3_*
- split: 2023_q3
path: es/2023_q3_*
- split: 2009_q4
path: es/2009_q4_*
- split: 2020_q4
path: es/2020_q4_*
- split: 2010_q2
path: es/2010_q2_*
- split: 2014_q2
path: es/2014_q2_*
- split: 2015_q1
path: es/2015_q1_*
- config_name: fr
data_files:
- split: 2005_q1_01
path: fr/2005_q1_01-*
- split: 2005_q1_02
path: fr/2005_q1_02-*
- split: 2005_q1_03
path: fr/2005_q1_03-*
- split: 2005_q2_04
path: fr/2005_q2_04-*
- split: 2005_q2_05
path: fr/2005_q2_05-*
- split: 2005_q2_06
path: fr/2005_q2_06-*
- split: 2005_q3_07
path: fr/2005_q3_07-*
- split: 2005_q3_08
path: fr/2005_q3_08-*
- split: 2005_q3_09
path: fr/2005_q3_09-*
- split: 2005_q4_10
path: fr/2005_q4_10-*
- split: 2005_q4_11
path: fr/2005_q4_11-*
- split: 2005_q4_12
path: fr/2005_q4_12-*
- split: 2006_q1_01
path: fr/2006_q1_01-*
- split: 2006_q1_02
path: fr/2006_q1_02-*
- split: 2006_q1_03
path: fr/2006_q1_03-*
- split: 2006_q2_04
path: fr/2006_q2_04-*
- split: 2006_q2_05
path: fr/2006_q2_05-*
- split: 2006_q2_06
path: fr/2006_q2_06-*
- split: 2006_q3_07
path: fr/2006_q3_07-*
- split: 2006_q3_08
path: fr/2006_q3_08-*
- split: 2006_q3_09
path: fr/2006_q3_09-*
- split: 2006_q4_10
path: fr/2006_q4_10-*
- split: 2006_q4_11
path: fr/2006_q4_11-*
- split: 2006_q4_12
path: fr/2006_q4_12-*
- split: 2007_q1_01
path: fr/2007_q1_01-*
- split: 2007_q1_02
path: fr/2007_q1_02-*
- split: 2007_q1_03
path: fr/2007_q1_03-*
- split: 2007_q2_04
path: fr/2007_q2_04-*
- split: 2007_q2_05
path: fr/2007_q2_05-*
- split: no_date
path: fr/no_date-*
- split: 2007_q2_06
path: fr/2007_q2_06-*
- split: 2007_q3_07
path: fr/2007_q3_07-*
- split: 2007_q3_08
path: fr/2007_q3_08-*
- split: 2007_q3_09
path: fr/2007_q3_09-*
- split: 2007_q4_10
path: fr/2007_q4_10-*
- split: 2007_q4_11
path: fr/2007_q4_11-*
- split: 2007_q4_12
path: fr/2007_q4_12-*
- split: 2008_q1_01
path: fr/2008_q1_01-*
- split: 2008_q1_02
path: fr/2008_q1_02-*
- split: 2008_q1_03
path: fr/2008_q1_03-*
- split: 2008_q2_04
path: fr/2008_q2_04-*
- split: 2008_q2_05
path: fr/2008_q2_05-*
- split: 2008_q2_06
path: fr/2008_q2_06-*
- split: 2008_q3_07
path: fr/2008_q3_07-*
- split: 2008_q3_08
path: fr/2008_q3_08-*
- split: 2008_q3_09
path: fr/2008_q3_09-*
- split: 2008_q4_10
path: fr/2008_q4_10-*
- split: 2008_q4_11
path: fr/2008_q4_11-*
- split: 2008_q4_12
path: fr/2008_q4_12-*
- split: 2009_q1_01
path: fr/2009_q1_01-*
- split: 2009_q1_02
path: fr/2009_q1_02-*
- split: 2009_q1_03
path: fr/2009_q1_03-*
- split: 2009_q2_04
path: fr/2009_q2_04-*
- split: 2009_q2_05
path: fr/2009_q2_05-*
- split: 2009_q2_06
path: fr/2009_q2_06-*
- split: 2009_q3_07
path: fr/2009_q3_07-*
- split: 2009_q3_08
path: fr/2009_q3_08-*
- split: 2009_q3_09
path: fr/2009_q3_09-*
- split: 2011_q2_04
path: fr/2011_q2_04-*
- split: 2009_q4_10
path: fr/2009_q4_10-*
- split: 2009_q4_11
path: fr/2009_q4_11-*
- split: 2009_q4_12
path: fr/2009_q4_12-*
- split: 2010_q1_01
path: fr/2010_q1_01-*
- split: 2010_q1_02
path: fr/2010_q1_02-*
- split: 2010_q1_03
path: fr/2010_q1_03-*
- split: 2010_q2_04
path: fr/2010_q2_04-*
- split: 2010_q2_05
path: fr/2010_q2_05-*
- split: 2010_q2_06
path: fr/2010_q2_06-*
- split: 2010_q3_07
path: fr/2010_q3_07-*
- split: 2010_q3_08
path: fr/2010_q3_08-*
- split: 2010_q3_09
path: fr/2010_q3_09-*
- split: 2010_q4_10
path: fr/2010_q4_10-*
- split: 2010_q4_11
path: fr/2010_q4_11-*
- split: 2010_q4_12
path: fr/2010_q4_12-*
- split: 2011_q1_01
path: fr/2011_q1_01-*
- split: 2011_q1_02
path: fr/2011_q1_02-*
- split: 2011_q1_03
path: fr/2011_q1_03-*
- split: 2011_q2_05
path: fr/2011_q2_05-*
- split: 2011_q2_06
path: fr/2011_q2_06-*
- split: 2011_q3_07
path: fr/2011_q3_07-*
- split: 2011_q3_08
path: fr/2011_q3_08-*
- split: 2011_q3_09
path: fr/2011_q3_09-*
- split: 2011_q4_10
path: fr/2011_q4_10-*
- split: 2011_q4_11
path: fr/2011_q4_11-*
- split: 2011_q4_12
path: fr/2011_q4_12-*
- split: 2012_q1_01
path: fr/2012_q1_01-*
- split: 2012_q1_02
path: fr/2012_q1_02-*
- split: 2012_q1_03
path: fr/2012_q1_03-*
- split: 2012_q2_04
path: fr/2012_q2_04-*
- split: 2012_q2_05
path: fr/2012_q2_05-*
- split: 2012_q2_06
path: fr/2012_q2_06-*
- split: 2012_q3_07
path: fr/2012_q3_07-*
- split: 2012_q3_08
path: fr/2012_q3_08-*
- split: 2024_q2_04
path: fr/2024_q2_04-*
- split: 2012_q3_09
path: fr/2012_q3_09-*
- split: 2012_q4_10
path: fr/2012_q4_10-*
- split: 2012_q4_11
path: fr/2012_q4_11-*
- split: 2012_q4_12
path: fr/2012_q4_12-*
- split: 2013_q1_01
path: fr/2013_q1_01-*
- split: 2013_q1_02
path: fr/2013_q1_02-*
- split: 2013_q1_03
path: fr/2013_q1_03-*
- split: 2013_q2_04
path: fr/2013_q2_04-*
- split: 2013_q2_05
path: fr/2013_q2_05-*
- split: 2013_q2_06
path: fr/2013_q2_06-*
- split: 2013_q3_07
path: fr/2013_q3_07-*
- split: 2013_q3_08
path: fr/2013_q3_08-*
- split: 2013_q3_09
path: fr/2013_q3_09-*
- split: 2013_q4_10
path: fr/2013_q4_10-*
- split: 2013_q4_11
path: fr/2013_q4_11-*
- split: 2013_q4_12
path: fr/2013_q4_12-*
- split: 2014_q1_01
path: fr/2014_q1_01-*
- split: 2014_q1_02
path: fr/2014_q1_02-*
- split: 2014_q1_03
path: fr/2014_q1_03-*
- split: 2024_q1_02
path: fr/2024_q1_02-*
- split: 2014_q2_04
path: fr/2014_q2_04-*
- split: 2014_q2_05
path: fr/2014_q2_05-*
- split: 2014_q2_06
path: fr/2014_q2_06-*
- split: 2014_q3_07
path: fr/2014_q3_07-*
- split: 2014_q3_08
path: fr/2014_q3_08-*
- split: 2014_q3_09
path: fr/2014_q3_09-*
- split: 2014_q4_10
path: fr/2014_q4_10-*
- split: 2014_q4_11
path: fr/2014_q4_11-*
- split: 2014_q4_12
path: fr/2014_q4_12-*
- split: 2015_q1_01
path: fr/2015_q1_01-*
- split: 2015_q1_02
path: fr/2015_q1_02-*
- split: 2015_q1_03
path: fr/2015_q1_03-*
- split: 2015_q3_09
path: fr/2015_q3_09-*
- split: 2015_q2_04
path: fr/2015_q2_04-*
- split: 2015_q2_05
path: fr/2015_q2_05-*
- split: 2015_q2_06
path: fr/2015_q2_06-*
- split: 2016_q3_08
path: fr/2016_q3_08-*
- split: 2015_q3_07
path: fr/2015_q3_07-*
- split: 2015_q3_08
path: fr/2015_q3_08-*
- split: 2015_q4_10
path: fr/2015_q4_10-*
- split: 2015_q4_11
path: fr/2015_q4_11-*
- split: 2015_q4_12
path: fr/2015_q4_12-*
- split: 2016_q1_01
path: fr/2016_q1_01-*
- split: 2016_q1_02
path: fr/2016_q1_02-*
- split: 2016_q1_03
path: fr/2016_q1_03-*
- split: 2016_q2_04
path: fr/2016_q2_04-*
- split: 2016_q2_05
path: fr/2016_q2_05-*
- split: 2016_q2_06
path: fr/2016_q2_06-*
- split: 2016_q3_07
path: fr/2016_q3_07-*
- split: 2016_q3_09
path: fr/2016_q3_09-*
- split: 2020_q1_01
path: fr/2020_q1_01-*
- split: 2016_q4_10
path: fr/2016_q4_10-*
- split: 2016_q4_11
path: fr/2016_q4_11-*
- split: 2016_q4_12
path: fr/2016_q4_12-*
- split: 2017_q1_01
path: fr/2017_q1_01-*
- split: 2017_q1_02
path: fr/2017_q1_02-*
- split: 2017_q1_03
path: fr/2017_q1_03-*
- split: 2017_q2_04
path: fr/2017_q2_04-*
- split: 2017_q2_05
path: fr/2017_q2_05-*
- split: 2017_q2_06
path: fr/2017_q2_06-*
- split: 2017_q3_07
path: fr/2017_q3_07-*
- split: 2017_q3_08
path: fr/2017_q3_08-*
- split: 2017_q3_09
path: fr/2017_q3_09-*
- split: 2017_q4_10
path: fr/2017_q4_10-*
- split: 2017_q4_11
path: fr/2017_q4_11-*
- split: 2017_q4_12
path: fr/2017_q4_12-*
- split: 2018_q1_01
path: fr/2018_q1_01-*
- split: 2018_q1_02
path: fr/2018_q1_02-*
- split: 2018_q1_03
path: fr/2018_q1_03-*
- split: 2018_q2_04
path: fr/2018_q2_04-*
- split: 2018_q2_05
path: fr/2018_q2_05-*
- split: 2018_q2_06
path: fr/2018_q2_06-*
- split: 2018_q3_07
path: fr/2018_q3_07-*
- split: 2018_q3_08
path: fr/2018_q3_08-*
- split: 2018_q3_09
path: fr/2018_q3_09-*
- split: 2018_q4_10
path: fr/2018_q4_10-*
- split: 2018_q4_11
path: fr/2018_q4_11-*
- split: 2018_q4_12
path: fr/2018_q4_12-*
- split: 2019_q1_01
path: fr/2019_q1_01-*
- split: 2019_q1_02
path: fr/2019_q1_02-*
- split: 2019_q1_03
path: fr/2019_q1_03-*
- split: 2019_q2_04
path: fr/2019_q2_04-*
- split: 2019_q2_05
path: fr/2019_q2_05-*
- split: 2019_q2_06
path: fr/2019_q2_06-*
- split: 2019_q3_07
path: fr/2019_q3_07-*
- split: 2019_q3_08
path: fr/2019_q3_08-*
- split: 2019_q3_09
path: fr/2019_q3_09-*
- split: 2019_q4_10
path: fr/2019_q4_10-*
- split: 2019_q4_11
path: fr/2019_q4_11-*
- split: 2019_q4_12
path: fr/2019_q4_12-*
- split: 2020_q1_02
path: fr/2020_q1_02-*
- split: 2020_q1_03
path: fr/2020_q1_03-*
- split: 2020_q3_09
path: fr/2020_q3_09-*
- split: 2020_q2_04
path: fr/2020_q2_04-*
- split: 2020_q2_05
path: fr/2020_q2_05-*
- split: 2020_q2_06
path: fr/2020_q2_06-*
- split: 2020_q3_07
path: fr/2020_q3_07-*
- split: 2020_q3_08
path: fr/2020_q3_08-*
- split: 2020_q4_10
path: fr/2020_q4_10-*
- split: 2020_q4_11
path: fr/2020_q4_11-*
- split: 2020_q4_12
path: fr/2020_q4_12-*
- split: 2021_q1_01
path: fr/2021_q1_01-*
- split: 2021_q1_02
path: fr/2021_q1_02-*
- split: 2021_q1_03
path: fr/2021_q1_03-*
- split: 2021_q2_04
path: fr/2021_q2_04-*
- split: 2021_q2_05
path: fr/2021_q2_05-*
- split: 2021_q2_06
path: fr/2021_q2_06-*
- split: 2021_q3_07
path: fr/2021_q3_07-*
- split: 2021_q3_08
path: fr/2021_q3_08-*
- split: 2021_q3_09
path: fr/2021_q3_09-*
- split: 2021_q4_10
path: fr/2021_q4_10-*
- split: 2021_q4_11
path: fr/2021_q4_11-*
- split: 2021_q4_12
path: fr/2021_q4_12-*
- split: 2022_q1_01
path: fr/2022_q1_01-*
- split: 2022_q1_02
path: fr/2022_q1_02-*
- split: 2022_q1_03
path: fr/2022_q1_03-*
- split: 2022_q2_05
path: fr/2022_q2_05-*
- split: 2022_q2_04
path: fr/2022_q2_04-*
- split: 2022_q2_06
path: fr/2022_q2_06-*
- split: 2022_q3_07
path: fr/2022_q3_07-*
- split: 2022_q3_08
path: fr/2022_q3_08-*
- split: 2022_q3_09
path: fr/2022_q3_09-*
- split: 2022_q4_12
path: fr/2022_q4_12-*
- split: 2022_q4_10
path: fr/2022_q4_10-*
- split: 2022_q4_11
path: fr/2022_q4_11-*
- split: 2023_q1_01
path: fr/2023_q1_01-*
- split: 2023_q1_03
path: fr/2023_q1_03-*
- split: 2023_q1_02
path: fr/2023_q1_02-*
- split: 2023_q2_04
path: fr/2023_q2_04-*
- split: 2023_q2_05
path: fr/2023_q2_05-*
- split: 2023_q2_06
path: fr/2023_q2_06-*
- split: 2023_q3_07
path: fr/2023_q3_07-*
- split: 2023_q3_08
path: fr/2023_q3_08-*
- split: 2023_q3_09
path: fr/2023_q3_09-*
- split: 2023_q4_10
path: fr/2023_q4_10-*
- split: 2023_q4_11
path: fr/2023_q4_11-*
- split: 2023_q4_12
path: fr/2023_q4_12-*
- split: 2024_q1_01
path: fr/2024_q1_01-*
- split: '2005'
path: fr/2005_*
- split: '2006'
path: fr/2006_*
- split: '2007'
path: fr/2007_*
- split: '2008'
path: fr/2008_*
- split: '2009'
path: fr/2009_*
- split: '2010'
path: fr/2010_*
- split: '2011'
path: fr/2011_*
- split: '2012'
path: fr/2012_*
- split: '2013'
path: fr/2013_*
- split: '2014'
path: fr/2014_*
- split: '2015'
path: fr/2015_*
- split: '2016'
path: fr/2016_*
- split: '2017'
path: fr/2017_*
- split: '2018'
path: fr/2018_*
- split: '2019'
path: fr/2019_*
- split: '2020'
path: fr/2020_*
- split: '2021'
path: fr/2021_*
- split: '2022'
path: fr/2022_*
- split: '2023'
path: fr/2023_*
- split: '2024'
path: fr/2024_*
- split: 2005_q2
path: fr/2005_q2_*
- split: 2016_q2
path: fr/2016_q2_*
- split: 2017_q1
path: fr/2017_q1_*
- split: 2010_q4
path: fr/2010_q4_*
- split: 2021_q1
path: fr/2021_q1_*
- split: 2014_q4
path: fr/2014_q4_*
- split: 2015_q3
path: fr/2015_q3_*
- split: 2019_q3
path: fr/2019_q3_*
- split: 2005_q4
path: fr/2005_q4_*
- split: 2016_q4
path: fr/2016_q4_*
- split: 2017_q3
path: fr/2017_q3_*
- split: 2021_q3
path: fr/2021_q3_*
- split: 2006_q2
path: fr/2006_q2_*
- split: 2024_q1
path: fr/2024_q1_*
- split: 2011_q1
path: fr/2011_q1_*
- split: 2022_q1
path: fr/2022_q1_*
- split: 2008_q2
path: fr/2008_q2_*
- split: 2012_q2
path: fr/2012_q2_*
- split: 2023_q2
path: fr/2023_q2_*
- split: 2013_q1
path: fr/2013_q1_*
- split: 2006_q4
path: fr/2006_q4_*
- split: 2011_q3
path: fr/2011_q3_*
- split: 2022_q3
path: fr/2022_q3_*
- split: 2008_q4
path: fr/2008_q4_*
- split: 2012_q4
path: fr/2012_q4_*
- split: 2014_q1
path: fr/2014_q1_*
- split: 2013_q3
path: fr/2013_q3_*
- split: 2023_q4
path: fr/2023_q4_*
- split: 2007_q1
path: fr/2007_q1_*
- split: 2018_q1
path: fr/2018_q1_*
- split: 2015_q2
path: fr/2015_q2_*
- split: 2019_q2
path: fr/2019_q2_*
- split: 2009_q1
path: fr/2009_q1_*
- split: 2020_q1
path: fr/2020_q1_*
- split: 2017_q2
path: fr/2017_q2_*
- split: 2007_q3
path: fr/2007_q3_*
- split: 2018_q3
path: fr/2018_q3_*
- split: 2021_q2
path: fr/2021_q2_*
- split: 2015_q4
path: fr/2015_q4_*
- split: 2019_q4
path: fr/2019_q4_*
- split: 2009_q3
path: fr/2009_q3_*
- split: 2020_q3
path: fr/2020_q3_*
- split: 2021_q4
path: fr/2021_q4_*
- split: 2010_q1
path: fr/2010_q1_*
- split: 2011_q2
path: fr/2011_q2_*
- split: 2022_q2
path: fr/2022_q2_*
- split: 2005_q1
path: fr/2005_q1_*
- split: 2016_q1
path: fr/2016_q1_*
- split: 2010_q3
path: fr/2010_q3_*
- split: 2013_q2
path: fr/2013_q2_*
- split: 2014_q3
path: fr/2014_q3_*
- split: 2011_q4
path: fr/2011_q4_*
- split: 2022_q4
path: fr/2022_q4_*
- split: 2005_q3
path: fr/2005_q3_*
- split: 2016_q3
path: fr/2016_q3_*
- split: 2013_q4
path: fr/2013_q4_*
- split: 2019_q1
path: fr/2019_q1_*
- split: 2006_q1
path: fr/2006_q1_*
- split: 2007_q2
path: fr/2007_q2_*
- split: 2017_q4
path: fr/2017_q4_*
- split: 2008_q1
path: fr/2008_q1_*
- split: 2018_q2
path: fr/2018_q2_*
- split: 2012_q1
path: fr/2012_q1_*
- split: 2023_q1
path: fr/2023_q1_*
- split: 2006_q3
path: fr/2006_q3_*
- split: 2009_q2
path: fr/2009_q2_*
- split: 2020_q2
path: fr/2020_q2_*
- split: 2024_q2
path: fr/2024_q2_*
- split: 2007_q4
path: fr/2007_q4_*
- split: 2018_q4
path: fr/2018_q4_*
- split: 2008_q3
path: fr/2008_q3_*
- split: 2012_q3
path: fr/2012_q3_*
- split: 2023_q3
path: fr/2023_q3_*
- split: 2009_q4
path: fr/2009_q4_*
- split: 2020_q4
path: fr/2020_q4_*
- split: 2010_q2
path: fr/2010_q2_*
- split: 2014_q2
path: fr/2014_q2_*
- split: 2015_q1
path: fr/2015_q1_*
- config_name: it
data_files:
- split: 2005_q1_03
path: it/2005_q1_03-*
- split: 2005_q2_04
path: it/2005_q2_04-*
- split: no_date
path: it/no_date-*
- split: 2005_q2_05
path: it/2005_q2_05-*
- split: 2006_q1_02
path: it/2006_q1_02-*
- split: 2005_q2_06
path: it/2005_q2_06-*
- split: 2005_q3_07
path: it/2005_q3_07-*
- split: 2005_q3_09
path: it/2005_q3_09-*
- split: 2005_q4_10
path: it/2005_q4_10-*
- split: 2005_q3_08
path: it/2005_q3_08-*
- split: 2005_q4_11
path: it/2005_q4_11-*
- split: 2005_q4_12
path: it/2005_q4_12-*
- split: 2006_q1_01
path: it/2006_q1_01-*
- split: 2006_q2_04
path: it/2006_q2_04-*
- split: 2006_q2_05
path: it/2006_q2_05-*
- split: 2005_q1_01
path: it/2005_q1_01-*
- split: 2006_q1_03
path: it/2006_q1_03-*
- split: 2007_q3_08
path: it/2007_q3_08-*
- split: 2007_q4_10
path: it/2007_q4_10-*
- split: 2006_q2_06
path: it/2006_q2_06-*
- split: 2006_q3_09
path: it/2006_q3_09-*
- split: 2006_q4_10
path: it/2006_q4_10-*
- split: 2006_q4_11
path: it/2006_q4_11-*
- split: 2006_q4_12
path: it/2006_q4_12-*
- split: 2007_q1_01
path: it/2007_q1_01-*
- split: 2007_q1_02
path: it/2007_q1_02-*
- split: 2007_q1_03
path: it/2007_q1_03-*
- split: 2007_q2_04
path: it/2007_q2_04-*
- split: 2006_q3_07
path: it/2006_q3_07-*
- split: 2007_q2_05
path: it/2007_q2_05-*
- split: 2007_q2_06
path: it/2007_q2_06-*
- split: 2006_q3_08
path: it/2006_q3_08-*
- split: 2007_q3_07
path: it/2007_q3_07-*
- split: 2007_q3_09
path: it/2007_q3_09-*
- split: 2008_q1_03
path: it/2008_q1_03-*
- split: 2008_q1_02
path: it/2008_q1_02-*
- split: 2007_q4_11
path: it/2007_q4_11-*
- split: 2007_q4_12
path: it/2007_q4_12-*
- split: 2008_q1_01
path: it/2008_q1_01-*
- split: 2008_q2_06
path: it/2008_q2_06-*
- split: 2008_q2_04
path: it/2008_q2_04-*
- split: 2008_q2_05
path: it/2008_q2_05-*
- split: 2008_q3_07
path: it/2008_q3_07-*
- split: 2009_q3_08
path: it/2009_q3_08-*
- split: 2008_q3_08
path: it/2008_q3_08-*
- split: 2008_q3_09
path: it/2008_q3_09-*
- split: 2008_q4_10
path: it/2008_q4_10-*
- split: 2008_q4_11
path: it/2008_q4_11-*
- split: 2008_q4_12
path: it/2008_q4_12-*
- split: 2009_q1_01
path: it/2009_q1_01-*
- split: 2009_q1_03
path: it/2009_q1_03-*
- split: 2009_q1_02
path: it/2009_q1_02-*
- split: 2012_q3_08
path: it/2012_q3_08-*
- split: 2009_q2_04
path: it/2009_q2_04-*
- split: 2009_q2_05
path: it/2009_q2_05-*
- split: 2009_q2_06
path: it/2009_q2_06-*
- split: 2009_q3_07
path: it/2009_q3_07-*
- split: 2009_q3_09
path: it/2009_q3_09-*
- split: 2009_q4_12
path: it/2009_q4_12-*
- split: 2009_q4_10
path: it/2009_q4_10-*
- split: 2009_q4_11
path: it/2009_q4_11-*
- split: 2010_q1_01
path: it/2010_q1_01-*
- split: 2010_q1_02
path: it/2010_q1_02-*
- split: 2010_q1_03
path: it/2010_q1_03-*
- split: 2010_q2_04
path: it/2010_q2_04-*
- split: 2010_q2_05
path: it/2010_q2_05-*
- split: 2010_q2_06
path: it/2010_q2_06-*
- split: 2010_q3_07
path: it/2010_q3_07-*
- split: 2010_q3_08
path: it/2010_q3_08-*
- split: 2010_q3_09
path: it/2010_q3_09-*
- split: 2010_q4_10
path: it/2010_q4_10-*
- split: 2010_q4_11
path: it/2010_q4_11-*
- split: 2010_q4_12
path: it/2010_q4_12-*
- split: 2011_q1_01
path: it/2011_q1_01-*
- split: 2011_q1_02
path: it/2011_q1_02-*
- split: 2011_q1_03
path: it/2011_q1_03-*
- split: 2011_q2_04
path: it/2011_q2_04-*
- split: 2011_q2_05
path: it/2011_q2_05-*
- split: 2011_q2_06
path: it/2011_q2_06-*
- split: 2011_q3_07
path: it/2011_q3_07-*
- split: 2011_q3_08
path: it/2011_q3_08-*
- split: 2011_q3_09
path: it/2011_q3_09-*
- split: 2011_q4_10
path: it/2011_q4_10-*
- split: 2011_q4_11
path: it/2011_q4_11-*
- split: 2011_q4_12
path: it/2011_q4_12-*
- split: 2012_q1_01
path: it/2012_q1_01-*
- split: 2012_q1_02
path: it/2012_q1_02-*
- split: 2012_q1_03
path: it/2012_q1_03-*
- split: 2012_q2_04
path: it/2012_q2_04-*
- split: 2012_q2_05
path: it/2012_q2_05-*
- split: 2012_q2_06
path: it/2012_q2_06-*
- split: 2012_q3_07
path: it/2012_q3_07-*
- split: 2012_q3_09
path: it/2012_q3_09-*
- split: 2012_q4_10
path: it/2012_q4_10-*
- split: 2012_q4_11
path: it/2012_q4_11-*
- split: 2012_q4_12
path: it/2012_q4_12-*
- split: 2013_q1_01
path: it/2013_q1_01-*
- split: 2013_q1_02
path: it/2013_q1_02-*
- split: 2013_q1_03
path: it/2013_q1_03-*
- split: 2013_q2_04
path: it/2013_q2_04-*
- split: 2013_q2_05
path: it/2013_q2_05-*
- split: 2013_q2_06
path: it/2013_q2_06-*
- split: 2013_q3_07
path: it/2013_q3_07-*
- split: 2013_q3_08
path: it/2013_q3_08-*
- split: 2013_q3_09
path: it/2013_q3_09-*
- split: 2013_q4_10
path: it/2013_q4_10-*
- split: 2013_q4_11
path: it/2013_q4_11-*
- split: 2013_q4_12
path: it/2013_q4_12-*
- split: 2014_q1_01
path: it/2014_q1_01-*
- split: 2014_q1_02
path: it/2014_q1_02-*
- split: 2014_q1_03
path: it/2014_q1_03-*
- split: 2014_q2_04
path: it/2014_q2_04-*
- split: 2014_q2_05
path: it/2014_q2_05-*
- split: 2014_q2_06
path: it/2014_q2_06-*
- split: 2014_q3_07
path: it/2014_q3_07-*
- split: 2014_q4_12
path: it/2014_q4_12-*
- split: 2014_q3_08
path: it/2014_q3_08-*
- split: 2014_q3_09
path: it/2014_q3_09-*
- split: 2014_q4_10
path: it/2014_q4_10-*
- split: 2014_q4_11
path: it/2014_q4_11-*
- split: 2015_q1_01
path: it/2015_q1_01-*
- split: 2015_q1_02
path: it/2015_q1_02-*
- split: 2015_q1_03
path: it/2015_q1_03-*
- split: 2015_q2_04
path: it/2015_q2_04-*
- split: 2015_q2_05
path: it/2015_q2_05-*
- split: 2015_q2_06
path: it/2015_q2_06-*
- split: 2015_q3_07
path: it/2015_q3_07-*
- split: 2016_q1_02
path: it/2016_q1_02-*
- split: 2015_q3_08
path: it/2015_q3_08-*
- split: 2015_q3_09
path: it/2015_q3_09-*
- split: 2015_q4_10
path: it/2015_q4_10-*
- split: 2015_q4_11
path: it/2015_q4_11-*
- split: 2015_q4_12
path: it/2015_q4_12-*
- split: 2016_q1_01
path: it/2016_q1_01-*
- split: 2016_q3_09
path: it/2016_q3_09-*
- split: 2016_q1_03
path: it/2016_q1_03-*
- split: 2016_q2_04
path: it/2016_q2_04-*
- split: 2016_q2_05
path: it/2016_q2_05-*
- split: 2016_q2_06
path: it/2016_q2_06-*
- split: 2016_q3_07
path: it/2016_q3_07-*
- split: 2016_q3_08
path: it/2016_q3_08-*
- split: 2018_q4_12
path: it/2018_q4_12-*
- split: 2016_q4_10
path: it/2016_q4_10-*
- split: 2016_q4_11
path: it/2016_q4_11-*
- split: 2016_q4_12
path: it/2016_q4_12-*
- split: 2017_q1_01
path: it/2017_q1_01-*
- split: 2017_q1_02
path: it/2017_q1_02-*
- split: 2017_q1_03
path: it/2017_q1_03-*
- split: 2017_q2_04
path: it/2017_q2_04-*
- split: 2017_q2_05
path: it/2017_q2_05-*
- split: 2017_q2_06
path: it/2017_q2_06-*
- split: 2017_q3_07
path: it/2017_q3_07-*
- split: 2017_q3_08
path: it/2017_q3_08-*
- split: 2017_q3_09
path: it/2017_q3_09-*
- split: 2017_q4_10
path: it/2017_q4_10-*
- split: 2017_q4_11
path: it/2017_q4_11-*
- split: 2017_q4_12
path: it/2017_q4_12-*
- split: 2018_q1_01
path: it/2018_q1_01-*
- split: 2018_q1_02
path: it/2018_q1_02-*
- split: 2018_q1_03
path: it/2018_q1_03-*
- split: 2018_q2_04
path: it/2018_q2_04-*
- split: 2018_q2_05
path: it/2018_q2_05-*
- split: 2018_q2_06
path: it/2018_q2_06-*
- split: 2018_q3_08
path: it/2018_q3_08-*
- split: 2018_q3_09
path: it/2018_q3_09-*
- split: 2018_q3_07
path: it/2018_q3_07-*
- split: 2018_q4_10
path: it/2018_q4_10-*
- split: 2018_q4_11
path: it/2018_q4_11-*
- split: 2019_q1_01
path: it/2019_q1_01-*
- split: 2019_q1_02
path: it/2019_q1_02-*
- split: 2019_q1_03
path: it/2019_q1_03-*
- split: 2019_q2_04
path: it/2019_q2_04-*
- split: 2019_q2_05
path: it/2019_q2_05-*
- split: 2019_q2_06
path: it/2019_q2_06-*
- split: 2019_q3_07
path: it/2019_q3_07-*
- split: 2019_q3_08
path: it/2019_q3_08-*
- split: 2019_q3_09
path: it/2019_q3_09-*
- split: 2019_q4_10
path: it/2019_q4_10-*
- split: 2019_q4_11
path: it/2019_q4_11-*
- split: 2019_q4_12
path: it/2019_q4_12-*
- split: 2020_q1_01
path: it/2020_q1_01-*
- split: 2020_q1_02
path: it/2020_q1_02-*
- split: 2020_q1_03
path: it/2020_q1_03-*
- split: 2020_q2_04
path: it/2020_q2_04-*
- split: 2020_q2_05
path: it/2020_q2_05-*
- split: 2020_q2_06
path: it/2020_q2_06-*
- split: 2020_q3_07
path: it/2020_q3_07-*
- split: 2020_q3_08
path: it/2020_q3_08-*
- split: 2020_q3_09
path: it/2020_q3_09-*
- split: 2020_q4_10
path: it/2020_q4_10-*
- split: 2020_q4_11
path: it/2020_q4_11-*
- split: 2020_q4_12
path: it/2020_q4_12-*
- split: 2021_q1_01
path: it/2021_q1_01-*
- split: 2021_q1_02
path: it/2021_q1_02-*
- split: 2021_q1_03
path: it/2021_q1_03-*
- split: 2021_q2_04
path: it/2021_q2_04-*
- split: 2021_q2_05
path: it/2021_q2_05-*
- split: 2021_q2_06
path: it/2021_q2_06-*
- split: 2021_q3_07
path: it/2021_q3_07-*
- split: 2021_q3_08
path: it/2021_q3_08-*
- split: 2021_q3_09
path: it/2021_q3_09-*
- split: 2021_q4_10
path: it/2021_q4_10-*
- split: 2021_q4_11
path: it/2021_q4_11-*
- split: 2021_q4_12
path: it/2021_q4_12-*
- split: 2022_q1_01
path: it/2022_q1_01-*
- split: 2022_q1_02
path: it/2022_q1_02-*
- split: 2022_q1_03
path: it/2022_q1_03-*
- split: 2022_q2_04
path: it/2022_q2_04-*
- split: 2022_q2_05
path: it/2022_q2_05-*
- split: 2022_q2_06
path: it/2022_q2_06-*
- split: 2022_q3_07
path: it/2022_q3_07-*
- split: 2022_q3_08
path: it/2022_q3_08-*
- split: 2022_q3_09
path: it/2022_q3_09-*
- split: 2022_q4_10
path: it/2022_q4_10-*
- split: 2022_q4_11
path: it/2022_q4_11-*
- split: 2022_q4_12
path: it/2022_q4_12-*
- split: 2023_q1_01
path: it/2023_q1_01-*
- split: 2023_q1_02
path: it/2023_q1_02-*
- split: 2023_q1_03
path: it/2023_q1_03-*
- split: 2023_q2_04
path: it/2023_q2_04-*
- split: 2023_q2_05
path: it/2023_q2_05-*
- split: 2023_q2_06
path: it/2023_q2_06-*
- split: 2023_q3_07
path: it/2023_q3_07-*
- split: 2023_q3_08
path: it/2023_q3_08-*
- split: 2023_q3_09
path: it/2023_q3_09-*
- split: 2023_q4_10
path: it/2023_q4_10-*
- split: 2023_q4_11
path: it/2023_q4_11-*
- split: 2023_q4_12
path: it/2023_q4_12-*
- split: 2024_q1_01
path: it/2024_q1_01-*
- split: '2005'
path: it/2005_*
- split: '2006'
path: it/2006_*
- split: '2007'
path: it/2007_*
- split: '2008'
path: it/2008_*
- split: '2009'
path: it/2009_*
- split: '2010'
path: it/2010_*
- split: '2011'
path: it/2011_*
- split: '2012'
path: it/2012_*
- split: '2013'
path: it/2013_*
- split: '2014'
path: it/2014_*
- split: '2015'
path: it/2015_*
- split: '2016'
path: it/2016_*
- split: '2017'
path: it/2017_*
- split: '2018'
path: it/2018_*
- split: '2019'
path: it/2019_*
- split: '2020'
path: it/2020_*
- split: '2021'
path: it/2021_*
- split: '2022'
path: it/2022_*
- split: '2023'
path: it/2023_*
- split: '2024'
path: it/2024_*
- split: 2005_q2
path: it/2005_q2_*
- split: 2016_q2
path: it/2016_q2_*
- split: 2017_q1
path: it/2017_q1_*
- split: 2010_q4
path: it/2010_q4_*
- split: 2021_q1
path: it/2021_q1_*
- split: 2014_q4
path: it/2014_q4_*
- split: 2015_q3
path: it/2015_q3_*
- split: 2019_q3
path: it/2019_q3_*
- split: 2005_q4
path: it/2005_q4_*
- split: 2016_q4
path: it/2016_q4_*
- split: 2017_q3
path: it/2017_q3_*
- split: 2021_q3
path: it/2021_q3_*
- split: 2006_q2
path: it/2006_q2_*
- split: 2024_q1
path: it/2024_q1_*
- split: 2011_q1
path: it/2011_q1_*
- split: 2022_q1
path: it/2022_q1_*
- split: 2008_q2
path: it/2008_q2_*
- split: 2012_q2
path: it/2012_q2_*
- split: 2023_q2
path: it/2023_q2_*
- split: 2013_q1
path: it/2013_q1_*
- split: 2006_q4
path: it/2006_q4_*
- split: 2011_q3
path: it/2011_q3_*
- split: 2022_q3
path: it/2022_q3_*
- split: 2008_q4
path: it/2008_q4_*
- split: 2012_q4
path: it/2012_q4_*
- split: 2014_q1
path: it/2014_q1_*
- split: 2013_q3
path: it/2013_q3_*
- split: 2023_q4
path: it/2023_q4_*
- split: 2007_q1
path: it/2007_q1_*
- split: 2018_q1
path: it/2018_q1_*
- split: 2015_q2
path: it/2015_q2_*
- split: 2019_q2
path: it/2019_q2_*
- split: 2009_q1
path: it/2009_q1_*
- split: 2020_q1
path: it/2020_q1_*
- split: 2017_q2
path: it/2017_q2_*
- split: 2007_q3
path: it/2007_q3_*
- split: 2018_q3
path: it/2018_q3_*
- split: 2021_q2
path: it/2021_q2_*
- split: 2015_q4
path: it/2015_q4_*
- split: 2019_q4
path: it/2019_q4_*
- split: 2009_q3
path: it/2009_q3_*
- split: 2020_q3
path: it/2020_q3_*
- split: 2021_q4
path: it/2021_q4_*
- split: 2010_q1
path: it/2010_q1_*
- split: 2011_q2
path: it/2011_q2_*
- split: 2022_q2
path: it/2022_q2_*
- split: 2005_q1
path: it/2005_q1_*
- split: 2016_q1
path: it/2016_q1_*
- split: 2010_q3
path: it/2010_q3_*
- split: 2013_q2
path: it/2013_q2_*
- split: 2014_q3
path: it/2014_q3_*
- split: 2011_q4
path: it/2011_q4_*
- split: 2022_q4
path: it/2022_q4_*
- split: 2005_q3
path: it/2005_q3_*
- split: 2016_q3
path: it/2016_q3_*
- split: 2013_q4
path: it/2013_q4_*
- split: 2019_q1
path: it/2019_q1_*
- split: 2006_q1
path: it/2006_q1_*
- split: 2007_q2
path: it/2007_q2_*
- split: 2017_q4
path: it/2017_q4_*
- split: 2008_q1
path: it/2008_q1_*
- split: 2018_q2
path: it/2018_q2_*
- split: 2012_q1
path: it/2012_q1_*
- split: 2023_q1
path: it/2023_q1_*
- split: 2006_q3
path: it/2006_q3_*
- split: 2009_q2
path: it/2009_q2_*
- split: 2020_q2
path: it/2020_q2_*
- split: 2007_q4
path: it/2007_q4_*
- split: 2018_q4
path: it/2018_q4_*
- split: 2008_q3
path: it/2008_q3_*
- split: 2012_q3
path: it/2012_q3_*
- split: 2023_q3
path: it/2023_q3_*
- split: 2009_q4
path: it/2009_q4_*
- split: 2020_q4
path: it/2020_q4_*
- split: 2010_q2
path: it/2010_q2_*
- split: 2014_q2
path: it/2014_q2_*
- split: 2015_q1
path: it/2015_q1_*
---
# Wikinews
The dataset contains news articles from Wikinews in different languages.
Each article is associated with metadata like title, url, and date.
The articles are grouped into data splits by the article's month, quarter, and year (the date is the one mentioned in the article text; later changes may have occurred, see the revision timestamp).
The dataset config name defines the language.
## Usage
```python
from datasets import load_dataset
# all English news from 2008
ds = load_dataset("malteos/wikinews", "en", split="2008")
# all German news from January 2017
ds = load_dataset("malteos/wikinews", "de", split="2017_q1_01")
```
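The split naming used above (year, `year_qN`, and `year_qN_MM` for the month) can be sketched with a small helper. This is illustrative only, not part of the dataset tooling:

```python
from datetime import date

def wikinews_splits(d):
    """Return the (year, quarter, month) split names an article dated `d` falls into."""
    quarter = (d.month - 1) // 3 + 1
    year_split = f"{d.year}"
    quarter_split = f"{d.year}_q{quarter}"
    month_split = f"{quarter_split}_{d.month:02d}"
    return year_split, quarter_split, month_split

print(wikinews_splits(date(2017, 1, 15)))  # ('2017', '2017_q1', '2017_q1_01')
```

Each article therefore appears in three overlapping splits of increasing granularity, so a yearly split is the union of its four quarterly splits.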
## Languages
- en
- es
- fr
- it
- de
## License
All text created after September 25, 2005 is available under the terms of the [Creative Commons Attribution 2.5 License](https://creativecommons.org/licenses/by/2.5/).
|
mask-distilled-one-sec-cv12/chunk_240 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1453185512
num_examples: 285386
download_size: 1484150414
dataset_size: 1453185512
---
# Dataset Card for "chunk_240"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
linges0103/datasetfortrain | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 42812
num_examples: 59
download_size: 22507
dataset_size: 42812
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "datasetfortrain"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
abdoutony207/en_ar_dt | ---
license: other
---
|
lazaroq11/billqa | ---
dataset_info:
features:
- name: text
dtype: string
- name: additional_info
dtype: string
splits:
- name: train
num_bytes: 240602641
num_examples: 9846
download_size: 9341153
dataset_size: 240602641
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "billqa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Technoculture/MedpromptCoT | ---
dataset_info:
features:
- name: question
dtype: string
- name: options
dtype: string
- name: reasoning
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 506301
num_examples: 676
download_size: 287262
dataset_size: 506301
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
language:
- en
---
# Dataset Card for "MedpromptCoT"
### Model used: gpt-3.5-turbo
## Dataset Mixture used for generating CoT
| Dataset Name | Original Size (Rows) | Rows Used |
|----------------------------------------------------|---------------|-------|
| openlifescienceai/medmcqa | 183k | 0.5k |
| GBaker/MedQA-USMLE-4-options | 10.2k | 0.5k |
Total Size: 1k
Size after selecting only correct CoT: 0.676k
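The "correct CoT only" filtering step above can be sketched as follows. This is a minimal illustration; the field name `predicted_answer` is hypothetical and not necessarily the script's actual schema:

```python
def keep_correct_cot(rows):
    """Keep only generations whose extracted final answer matches the gold answer.

    `rows` is a list of dicts; `predicted_answer` is a hypothetical field
    holding the answer parsed from the generated chain of thought.
    """
    return [
        r for r in rows
        if r["predicted_answer"].strip().lower() == r["answer"].strip().lower()
    ]

rows = [
    {"answer": "B", "predicted_answer": "b"},   # kept: matches gold answer
    {"answer": "C", "predicted_answer": "A"},   # dropped: wrong answer
]
print(len(keep_correct_cot(rows)))  # 1
```

Applied to the 1k generated examples, a filter of this kind yields the 676 retained rows.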
## How to Run the Script:
To rerun the script, follow these steps:
1. Environment Setup:
   - Ensure a Python environment with the required dependencies (specified in requirements.txt) is set up.
   - Install the necessary libraries using pip install -r requirements.txt.
2. Arguments
- Choose the desired language model using the --model argument.
- If using OpenAI's models (gpt-3.5-turbo or gpt-4-turbo-preview), set the OPENAI_API_KEY environment variable. For using Together models, set the TOGETHER_API_KEY and TOGETHER_API_BASE environment variables.
   - Specify the LLM client (openai, together, or huggingface) using the --llm_client_type argument.
- Specify the dataset name via the --dataset argument.
- Define the output file location via the --output_file argument.
   - Specify the Hugging Face dataset repository for uploading the new dataset consisting of the CoTs using the --huggingface_repo argument.
   - Specify the checkpointing interval using the --checkpointing_steps argument.
- Specify the count of examples to be taken from the dataset using --count.
Run the script using python generate_medprompt_cot.py [arguments].
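A hypothetical invocation combining the arguments above might look like this (the dataset name, output path, repository, and step/count values are illustrative):

```shell
export OPENAI_API_KEY="..."  # required for OpenAI models

python generate_medprompt_cot.py \
  --model gpt-3.5-turbo \
  --llm_client_type openai \
  --dataset openlifescienceai/medmcqa \
  --output_file cot_output.jsonl \
  --huggingface_repo Technoculture/MedpromptCoT \
  --checkpointing_steps 50 \
  --count 500
```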
### For full details, please see the generate_medprompt_cot.py file. |
tyzhu/squad_qa_baseline_v5_full_recite_ans_sent_last_permute_rerun | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 2996506.0
num_examples: 2385
- name: validation
num_bytes: 395889
num_examples: 300
download_size: 842977
dataset_size: 3392395.0
---
# Dataset Card for "squad_qa_baseline_v5_full_recite_ans_sent_last_permute_rerun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_172 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1072367324.0
num_examples: 208957
download_size: 1089947138
dataset_size: 1072367324.0
---
# Dataset Card for "chunk_172"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ChaoticNeutrals__Prima-LelantaclesV7-experimental-7b | ---
pretty_name: Evaluation run of ChaoticNeutrals/Prima-LelantaclesV7-experimental-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ChaoticNeutrals/Prima-LelantaclesV7-experimental-7b](https://huggingface.co/ChaoticNeutrals/Prima-LelantaclesV7-experimental-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChaoticNeutrals__Prima-LelantaclesV7-experimental-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-14T07:48:27.172404](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__Prima-LelantaclesV7-experimental-7b/blob/main/results_2024-03-14T07-48-27.172404.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6495664452929546,\n\
\ \"acc_stderr\": 0.03218441389606728,\n \"acc_norm\": 0.6487168204006772,\n\
\ \"acc_norm_stderr\": 0.03286122397355617,\n \"mc1\": 0.5826193390452876,\n\
\ \"mc1_stderr\": 0.017262891063272164,\n \"mc2\": 0.7461832399602438,\n\
\ \"mc2_stderr\": 0.01424474030426709\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.013284525292403511,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7162915753833897,\n\
\ \"acc_stderr\": 0.0044987571944934,\n \"acc_norm\": 0.8871738697470624,\n\
\ \"acc_norm_stderr\": 0.0031573355082588415\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933712,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933712\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n\
\ \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n\
\ \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n\
\ \"acc_stderr\": 0.032400380867927465,\n \"acc_norm\": 0.5659574468085107,\n\
\ \"acc_norm_stderr\": 0.032400380867927465\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.0470070803355104,\n\
\ \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.0470070803355104\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"\
acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473086,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473086\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926924,\n\
\ \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926924\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464085,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464085\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n\
\ \"acc_stderr\": 0.016611393687268584,\n \"acc_norm\": 0.4424581005586592,\n\
\ \"acc_norm_stderr\": 0.016611393687268584\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460842,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460842\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.46479791395045633,\n \"acc_stderr\": 0.012738547371303956,\n\
\ \"acc_norm\": 0.46479791395045633,\n \"acc_norm_stderr\": 0.012738547371303956\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n \"\
acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5826193390452876,\n\
\ \"mc1_stderr\": 0.017262891063272164,\n \"mc2\": 0.7461832399602438,\n\
\ \"mc2_stderr\": 0.01424474030426709\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.0100992082460656\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6929492039423806,\n \
\ \"acc_stderr\": 0.012705685723131712\n }\n}\n```"
repo_url: https://huggingface.co/ChaoticNeutrals/Prima-LelantaclesV7-experimental-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|arc:challenge|25_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|gsm8k|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hellaswag|10_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T07-48-27.172404.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T07-48-27.172404.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- '**/details_harness|winogrande|5_2024-03-14T07-48-27.172404.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-14T07-48-27.172404.parquet'
- config_name: results
data_files:
- split: 2024_03_14T07_48_27.172404
path:
- results_2024-03-14T07-48-27.172404.parquet
- split: latest
path:
- results_2024-03-14T07-48-27.172404.parquet
---
# Dataset Card for Evaluation run of ChaoticNeutrals/Prima-LelantaclesV7-experimental-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ChaoticNeutrals/Prima-LelantaclesV7-experimental-7b](https://huggingface.co/ChaoticNeutrals/Prima-LelantaclesV7-experimental-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ChaoticNeutrals__Prima-LelantaclesV7-experimental-7b",
"harness_winogrande_5",
split="train")
```
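The config name passed to `load_dataset` can be derived from the harness task identifiers used in the parquet file names above. The sketch below captures that mapping as inferred from this card's own config list (it is an observation about the naming scheme here, not an official API):

```python
def task_to_config_name(task_id: str) -> str:
    """Map a harness task id (as used in the parquet file names) to the
    config name used in this card, e.g.
    'harness|truthfulqa:mc|0' -> 'harness_truthfulqa_mc_0'.
    Inferred from the config list above, not an official rule."""
    for sep in "|:-":
        task_id = task_id.replace(sep, "_")
    return task_id

print(task_to_config_name("harness|hendrycksTest-professional_law|5"))
# -> harness_hendrycksTest_professional_law_5
```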
## Latest results
These are the [latest results from run 2024-03-14T07:48:27.172404](https://huggingface.co/datasets/open-llm-leaderboard/details_ChaoticNeutrals__Prima-LelantaclesV7-experimental-7b/blob/main/results_2024-03-14T07-48-27.172404.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6495664452929546,
"acc_stderr": 0.03218441389606728,
"acc_norm": 0.6487168204006772,
"acc_norm_stderr": 0.03286122397355617,
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272164,
"mc2": 0.7461832399602438,
"mc2_stderr": 0.01424474030426709
},
"harness|arc:challenge|25": {
"acc": 0.7081911262798635,
"acc_stderr": 0.013284525292403511,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545796
},
"harness|hellaswag|10": {
"acc": 0.7162915753833897,
"acc_stderr": 0.0044987571944934,
"acc_norm": 0.8871738697470624,
"acc_norm_stderr": 0.0031573355082588415
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933712,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933712
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473086,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473086
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926924,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926924
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464085,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464085
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4424581005586592,
"acc_stderr": 0.016611393687268584,
"acc_norm": 0.4424581005586592,
"acc_norm_stderr": 0.016611393687268584
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303956,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303956
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272164,
"mc2": 0.7461832399602438,
"mc2_stderr": 0.01424474030426709
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.0100992082460656
},
"harness|gsm8k|5": {
"acc": 0.6929492039423806,
"acc_stderr": 0.012705685723131712
}
}
```
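The aggregate "all" accuracy above is an unweighted mean over the per-task accuracies. As a quick illustration, the sketch below recomputes such a mean from a hand-copied, abridged subset of the dict above; the helper name is ours and only three tasks are included, so the result will not match the full "all" value:

```python
# Recompute a macro-average accuracy from a results dict shaped like the one
# above. Only a few tasks are reproduced here for brevity; the structure
# matches the harness output (per-task dicts keyed by "harness|<task>|<shots>").
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7302631578947368},
}

def macro_average_acc(results: dict) -> float:
    """Unweighted mean of per-task 'acc' values (skips entries without one)."""
    accs = [v["acc"] for v in results.values() if "acc" in v]
    return sum(accs) / len(accs)

print(round(macro_average_acc(results), 4))  # mean over the three tasks shown
```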
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Sunbird/salt | ---
dataset_info:
- config_name: multispeaker-ach
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: audio
sequence: float32
- name: audio_language
dtype: string
- name: is_studio
dtype: bool
- name: speaker_id
dtype: string
- name: sample_rate
dtype: int64
splits:
- name: train
num_bytes: 1789773755
num_examples: 4811
- name: dev
num_bytes: 37429640
num_examples: 101
- name: test
num_bytes: 36224395
num_examples: 96
download_size: 861112801
dataset_size: 1863427790
- config_name: multispeaker-eng
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: audio
sequence: float32
- name: audio_language
dtype: string
- name: is_studio
dtype: bool
- name: speaker_id
dtype: string
- name: sample_rate
dtype: int64
splits:
- name: train
num_bytes: 1490684144
num_examples: 4797
- name: dev
num_bytes: 30879913
num_examples: 100
- name: test
num_bytes: 32136197
num_examples: 96
download_size: 746376946
dataset_size: 1553700254
- config_name: multispeaker-lgg
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: audio
sequence: float32
- name: audio_language
dtype: string
- name: is_studio
dtype: bool
- name: speaker_id
dtype: string
- name: sample_rate
dtype: int64
splits:
- name: train
num_bytes: 2346309650
num_examples: 4768
- name: dev
num_bytes: 49044863
num_examples: 101
- name: test
num_bytes: 49347397
num_examples: 96
download_size: 1191834787
dataset_size: 2444701910
- config_name: multispeaker-lug
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: audio
sequence: float32
- name: audio_language
dtype: string
- name: is_studio
dtype: bool
- name: speaker_id
dtype: string
- name: sample_rate
dtype: int64
splits:
- name: train
num_bytes: 2000647332
num_examples: 5016
- name: dev
num_bytes: 38741382
num_examples: 103
- name: test
num_bytes: 39746716
num_examples: 97
download_size: 1010619540
dataset_size: 2079135430
- config_name: multispeaker-nyn
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: audio
sequence: float32
- name: audio_language
dtype: string
- name: is_studio
dtype: bool
- name: speaker_id
dtype: string
- name: sample_rate
dtype: int64
splits:
- name: train
num_bytes: 2097997736
num_examples: 4812
- name: dev
num_bytes: 42040138
num_examples: 101
- name: test
num_bytes: 45063129
num_examples: 96
download_size: 1426293640
dataset_size: 2185101003
- config_name: multispeaker-teo
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: audio
sequence: float32
- name: audio_language
dtype: string
- name: is_studio
dtype: bool
- name: speaker_id
dtype: string
- name: sample_rate
dtype: int64
splits:
- name: train
num_bytes: 1980187546
num_examples: 4839
- name: dev
num_bytes: 38906909
num_examples: 99
- name: test
num_bytes: 40474249
num_examples: 96
download_size: 992185148
dataset_size: 2059568704
- config_name: studio-acholi
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: audio
sequence: float32
- name: audio_language
dtype: string
- name: is_studio
dtype: bool
- name: speaker_id
dtype: string
- name: sample_rate
dtype: int64
splits:
- name: train
num_bytes: 1347658634
num_examples: 4801
- name: dev
num_bytes: 27757030
num_examples: 101
- name: test
num_bytes: 26447325
num_examples: 96
download_size: 698234854
dataset_size: 1401862989
- config_name: studio-ateso
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: audio
sequence: float32
- name: audio_language
dtype: string
- name: is_studio
dtype: bool
- name: speaker_id
dtype: string
- name: sample_rate
dtype: int64
splits:
- name: train
num_bytes: 2308097503
num_examples: 4564
- name: dev
num_bytes: 49170958
num_examples: 96
- name: test
num_bytes: 47400438
num_examples: 92
download_size: 977293946
dataset_size: 2404668899
- config_name: studio-english
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: audio
sequence: float32
- name: audio_language
dtype: string
- name: is_studio
dtype: bool
- name: speaker_id
dtype: string
- name: sample_rate
dtype: int64
splits:
- name: train
num_bytes: 851109381
num_examples: 2411
- name: dev
num_bytes: 17784430
num_examples: 50
- name: test
num_bytes: 15322757
num_examples: 42
download_size: 435775221
dataset_size: 884216568
- config_name: studio-luganda
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: audio
sequence: float32
- name: audio_language
dtype: string
- name: is_studio
dtype: bool
- name: speaker_id
dtype: string
- name: sample_rate
dtype: int64
splits:
- name: train
num_bytes: 880656730
num_examples: 2395
- name: dev
num_bytes: 18853020
num_examples: 50
- name: test
num_bytes: 16076901
num_examples: 43
download_size: 455441369
dataset_size: 915586651
- config_name: studio-runyankole
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: audio
sequence: float32
- name: audio_language
dtype: string
- name: is_studio
dtype: bool
- name: speaker_id
dtype: string
- name: sample_rate
dtype: int64
splits:
- name: train
num_bytes: 39234984
num_examples: 94
- name: dev
num_bytes: 1666059
num_examples: 4
- name: test
num_bytes: 947547
num_examples: 2
download_size: 20592402
dataset_size: 41848590
- config_name: text-all
features:
- name: id
dtype: int64
- name: teo_text
dtype: string
- name: swa_text
dtype: string
- name: eng_text
dtype: string
- name: nyn_text
dtype: string
- name: ibo_text
dtype: string
- name: ach_text
dtype: string
- name: lgg_text
dtype: string
- name: lug_text
dtype: string
splits:
- name: train
num_bytes: 11763775
num_examples: 23947
- name: dev
num_bytes: 242587
num_examples: 496
- name: test
num_bytes: 253968
num_examples: 500
download_size: 7228279
dataset_size: 12260330
configs:
- config_name: multispeaker-ach
data_files:
- split: train
path: multispeaker-ach/train-*
- split: dev
path: multispeaker-ach/dev-*
- split: test
path: multispeaker-ach/test-*
- config_name: multispeaker-eng
data_files:
- split: train
path: multispeaker-eng/train-*
- split: dev
path: multispeaker-eng/dev-*
- split: test
path: multispeaker-eng/test-*
- config_name: multispeaker-lgg
data_files:
- split: train
path: multispeaker-lgg/train-*
- split: dev
path: multispeaker-lgg/dev-*
- split: test
path: multispeaker-lgg/test-*
- config_name: multispeaker-lug
data_files:
- split: train
path: multispeaker-lug/train-*
- split: dev
path: multispeaker-lug/dev-*
- split: test
path: multispeaker-lug/test-*
- config_name: multispeaker-nyn
data_files:
- split: train
path: multispeaker-nyn/train-*
- split: dev
path: multispeaker-nyn/dev-*
- split: test
path: multispeaker-nyn/test-*
- config_name: multispeaker-teo
data_files:
- split: train
path: multispeaker-teo/train-*
- split: dev
path: multispeaker-teo/dev-*
- split: test
path: multispeaker-teo/test-*
- config_name: studio-acholi
data_files:
- split: train
path: studio-acholi/train-*
- split: dev
path: studio-acholi/dev-*
- split: test
path: studio-acholi/test-*
- config_name: studio-ateso
data_files:
- split: train
path: studio-ateso/train-*
- split: dev
path: studio-ateso/dev-*
- split: test
path: studio-ateso/test-*
- config_name: studio-english
data_files:
- split: train
path: studio-english/train-*
- split: dev
path: studio-english/dev-*
- split: test
path: studio-english/test-*
- config_name: studio-luganda
data_files:
- split: train
path: studio-luganda/train-*
- split: dev
path: studio-luganda/dev-*
- split: test
path: studio-luganda/test-*
- config_name: studio-runyankole
data_files:
- split: train
path: studio-runyankole/train-*
- split: dev
path: studio-runyankole/dev-*
- split: test
path: studio-runyankole/test-*
- config_name: text-all
data_files:
- split: train
path: text-all/train-*
- split: dev
path: text-all/dev-*
- split: test
path: text-all/test-*
---
|
felipesampaio/darwin2dataset | ---
license: openrail
---
|
witchling22/tokenized_T5_base | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: source
dtype: string
- name: source_labels
dtype: string
- name: rouge_scores
dtype: string
- name: paper_id
dtype: string
- name: target
dtype: string
- name: full_source_text
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 17340567
num_examples: 1992
- name: test
num_bytes: 5620222
num_examples: 618
- name: validation
num_bytes: 5534448
num_examples: 619
download_size: 6371599
dataset_size: 28495237
---
# Dataset Card for "tokenized_T5_base"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NebulaSense/Legal_Clause_Instructions | ---
license: cc-by-nc-4.0
--- |
wangxingjun778/test_123 | ---
license: apache-2.0
---
## 1. ONLY FOR TEST
## 2. ONLY FOR TEST
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/629bee15 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 176
num_examples: 10
download_size: 1341
dataset_size: 176
---
# Dataset Card for "629bee15"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/code_instructions_standardized_cluster_8_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 32500014
num_examples: 26748
download_size: 15520517
dataset_size: 32500014
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_8_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EleutherAI/naturenet | ---
dataset_info:
features:
- name: img
dtype: image
- name: label
dtype:
class_label:
names:
'0': amphibian
'1': bird
'2': dog
'3': feline
'4': fish
'5': flower
'6': horse
'7': primate
'8': rodent
'9': snake
splits:
- name: train
num_bytes: 2195586500.24
num_examples: 490000
- name: test
num_bytes: 45820817.76
num_examples: 10000
download_size: 2188877286
dataset_size: 2241407318.0
---
# Dataset Card for "naturenet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wyx-ucl/SUM-DATASET-BASED-EDGAR-CORPUS | ---
license: other
---
|
PJMixers/oasst2_dpo_metharme_english | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: lang
dtype: string
- name: parent_id
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 32775306
num_examples: 12353
- name: validation
num_bytes: 1717916
num_examples: 668
download_size: 18679121
dataset_size: 34493222
source_datasets:
- OpenAssistant/oasst2
tags:
- dpo
- rlhf
- human-feedback
- reward
- preference
- pairwise
- pair
pretty_name: Open Assistant 2 DPO (Metharme)
size_categories:
- 10K<n<100K
---
Similar to [monology/oasst2_dpo](https://huggingface.co/datasets/monology/oasst2_dpo), except this uses Metharme tags instead. Only samples marked English were kept. |
datasets-examples/doc-image-6 | ---
size_categories:
- n<1K
---
# [doc] image dataset 6
This dataset contains 4 JPEG files in the `train/images/` subdirectory, along with a `train/metadata.csv` file that provides the data for the other columns. The metadata file contains relative paths to the images. |
open-llm-leaderboard/details_l3utterfly__open-llama-3b-v2-layla | ---
pretty_name: Evaluation run of l3utterfly/open-llama-3b-v2-layla
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [l3utterfly/open-llama-3b-v2-layla](https://huggingface.co/l3utterfly/open-llama-3b-v2-layla)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_l3utterfly__open-llama-3b-v2-layla\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T08:49:03.131155](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__open-llama-3b-v2-layla/blob/main/results_2023-09-17T08-49-03.131155.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.011954697986577181,\n\
\ \"em_stderr\": 0.0011130056898859086,\n \"f1\": 0.07875629194630916,\n\
\ \"f1_stderr\": 0.0018920865515620476,\n \"acc\": 0.3194349118852447,\n\
\ \"acc_stderr\": 0.008202509803690292\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.011954697986577181,\n \"em_stderr\": 0.0011130056898859086,\n\
\ \"f1\": 0.07875629194630916,\n \"f1_stderr\": 0.0018920865515620476\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01061410159211524,\n \
\ \"acc_stderr\": 0.0028227133223877035\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6282557221783741,\n \"acc_stderr\": 0.013582306284992879\n\
\ }\n}\n```"
repo_url: https://huggingface.co/l3utterfly/open-llama-3b-v2-layla
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T08_49_03.131155
path:
- '**/details_harness|drop|3_2023-09-17T08-49-03.131155.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T08-49-03.131155.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T08_49_03.131155
path:
- '**/details_harness|gsm8k|5_2023-09-17T08-49-03.131155.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T08-49-03.131155.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:37:31.844402.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:37:31.844402.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-18T14:37:31.844402.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T08_49_03.131155
path:
- '**/details_harness|winogrande|5_2023-09-17T08-49-03.131155.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T08-49-03.131155.parquet'
- config_name: results
data_files:
- split: 2023_08_18T14_37_31.844402
path:
- results_2023-08-18T14:37:31.844402.parquet
- split: 2023_09_17T08_49_03.131155
path:
- results_2023-09-17T08-49-03.131155.parquet
- split: latest
path:
- results_2023-09-17T08-49-03.131155.parquet
---
# Dataset Card for Evaluation run of l3utterfly/open-llama-3b-v2-layla
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/l3utterfly/open-llama-3b-v2-layla
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [l3utterfly/open-llama-3b-v2-layla](https://huggingface.co/l3utterfly/open-llama-3b-v2-layla) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_l3utterfly__open-llama-3b-v2-layla",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-17T08:49:03.131155](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__open-llama-3b-v2-layla/blob/main/results_2023-09-17T08-49-03.131155.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.011954697986577181,
"em_stderr": 0.0011130056898859086,
"f1": 0.07875629194630916,
"f1_stderr": 0.0018920865515620476,
"acc": 0.3194349118852447,
"acc_stderr": 0.008202509803690292
},
"harness|drop|3": {
"em": 0.011954697986577181,
"em_stderr": 0.0011130056898859086,
"f1": 0.07875629194630916,
"f1_stderr": 0.0018920865515620476
},
"harness|gsm8k|5": {
"acc": 0.01061410159211524,
"acc_stderr": 0.0028227133223877035
},
"harness|winogrande|5": {
"acc": 0.6282557221783741,
"acc_stderr": 0.013582306284992879
}
}
```
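For the two accuracy-scored tasks, the aggregated `acc` reported under `"all"` works out to the unweighted mean of the per-task accuracies (the `em`/`f1` fields come from the single DROP task). A quick sanity check using the numbers above:

```python
# Per-task accuracies copied from the "latest" results above.
task_acc = {
    "harness|gsm8k|5": 0.01061410159211524,
    "harness|winogrande|5": 0.6282557221783741,
}

# "all"/"acc" is the unweighted mean over the accuracy-scored tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(abs(mean_acc - 0.3194349118852447) < 1e-12)  # True
```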
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sanchit-gandhi/librispeech_asr_dummy_pseudo_labelled | ---
dataset_info:
config_name: clean
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: validation
num_bytes: 9700520.0
num_examples: 73
download_size: 9198584
dataset_size: 9700520.0
configs:
- config_name: clean
data_files:
- split: validation
path: clean/validation-*
---
|
adamjweintraut/kwsylgen | ---
dataset_info:
features:
- name: midi_id
dtype: string
- name: song_title
dtype: string
- name: lyrics
dtype: string
- name: genre
dtype: string
- name: lyric_summary_bartv2
dtype: string
- name: topic
dtype: string
- name: clean_lyrics
dtype: string
- name: lyric_chunk_n
dtype: int64
- name: prev_clean_lyrics
dtype: string
- name: sylls
dtype: int64
- name: keywords
sequence: string
- name: orig
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 358889793.8653989
num_examples: 178354
- name: test
num_bytes: 47312159.64738172
num_examples: 23126
- name: valid
num_bytes: 47185445.65799565
num_examples: 23683
download_size: 21206048
dataset_size: 453387399.17077625
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
EleutherAI/headqa | ---
license: other
--- |
CyberHarem/shiina_mahiru_otonarinotenshisamaniitsunomanikadameningennisareteitaken | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Shiina Mahiru/椎名真昼 (Otonari no Tenshi-sama ni Itsunomanika Dame Ningen ni Sareteita Ken)
This is the dataset of Shiina Mahiru/椎名真昼 (Otonari no Tenshi-sama ni Itsunomanika Dame Ningen ni Sareteita Ken), containing 463 images and their tags.
The core tags of this character are `long_hair, blonde_hair, yellow_eyes, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 463 | 399.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiina_mahiru_otonarinotenshisamaniitsunomanikadameningennisareteitaken/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 463 | 399.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiina_mahiru_otonarinotenshisamaniitsunomanikadameningennisareteitaken/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 830 | 656.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiina_mahiru_otonarinotenshisamaniitsunomanikadameningennisareteitaken/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shiina_mahiru_otonarinotenshisamaniitsunomanikadameningennisareteitaken',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
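The IMG+TXT packages listed above ship each image alongside a same-named `.txt` file holding its tags (the naming convention and image extensions are assumed from the package table, not documented here; adjust to whatever the archive actually contains). A minimal sketch for pairing them after unzipping:

```python
import glob
import os


def load_img_txt_pairs(dataset_dir):
    """Pair each image with its same-named sidecar .txt tag file."""
    pairs = []
    for txt_path in sorted(glob.glob(os.path.join(dataset_dir, "*.txt"))):
        stem, _ = os.path.splitext(txt_path)
        # The candidate extensions are an assumption; extend as needed.
        for ext in (".png", ".jpg", ".jpeg", ".webp"):
            img_path = stem + ext
            if os.path.exists(img_path):
                with open(txt_path, encoding="utf-8") as f:
                    pairs.append((img_path, f.read().strip()))
                break  # stop at the first matching image
    return pairs
```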
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, blush, open_mouth, portrait, solo, close-up, looking_at_viewer, :d, :o |
| 1 | 10 |  |  |  |  |  | 1girl, closed_mouth, portrait, solo, blush, looking_at_viewer, smile, close-up |
| 2 | 9 |  |  |  |  |  | 1girl, portrait, shirt, solo, blush, indoors, closed_mouth, collarbone, looking_at_viewer |
| 3 | 7 |  |  |  |  |  | 1girl, collared_shirt, open_mouth, solo, white_shirt, blush, indoors, upper_body, curtains, portrait |
| 4 | 9 |  |  |  |  |  | 1girl, closed_mouth, solo, blush, portrait, profile, white_shirt, collared_shirt, from_side |
| 5 | 6 |  |  |  |  |  | 1girl, closed_mouth, collared_shirt, indoors, long_sleeves, looking_at_viewer, smile, solo, upper_body, white_shirt, dutch_angle, holding, apron, whisk |
| 6 | 5 |  |  |  |  |  | 1girl, ponytail, portrait, sidelocks, solo, closed_mouth, shirt, smile, collarbone, anime_coloring, apron, grey_background, simple_background |
| 7 | 7 |  |  |  |  |  | 1girl, indoors, ponytail, sidelocks, smile, solo, blush, collarbone, open_mouth, purple_shirt, very_long_hair, yellow_apron, long_sleeves, standing, white_skirt |
| 8 | 9 |  |  |  |  |  | 1girl, sidelocks, solo, braid, collarbone, white_shirt, open_mouth, upper_body, blue_ribbon, blunt_bangs, single_hair_bun |
| 9 | 24 |  |  |  |  |  | 1girl, school_uniform, white_shirt, collared_shirt, solo, blazer, red_bowtie, closed_mouth, upper_body, blue_jacket, blurry_background, looking_at_viewer, outdoors, blush |
| 10 | 6 |  |  |  |  |  | 1girl, hair_scrunchie, ponytail, sidelocks, blue_scrunchie, closed_mouth, solo, blush, indoors, pink_shirt, profile, upper_body |
| 11 | 7 |  |  |  |  |  | 1girl, solo_focus, upper_body, 2girls, closed_mouth, collared_shirt, looking_at_viewer, white_shirt, brown_hair, collarbone, blush, pajamas, simple_background, smile |
| 12 | 5 |  |  |  |  |  | 1girl, blazer, brown_skirt, long_sleeves, plaid_skirt, school_uniform, very_long_hair, 2boys, blue_jacket, shoes, smile, solo_focus, white_shirt, 1boy, head_out_of_frame, pants, pleated_skirt, red_bowtie |
| 13 | 7 |  |  |  |  |  | 1girl, braid, cherry_blossoms, hair_bow, long_sleeves, looking_at_viewer, outdoors, smile, solo, upper_body, white_shirt, closed_mouth, petals, tree, from_side, looking_back, flower, very_long_hair, floating_hair |
| 14 | 5 |  |  |  |  |  | 1girl, blanket, pillow, under_covers, blush, closed_mouth, indoors, looking_at_viewer, on_bed, on_back, solo_focus |
| 15 | 8 |  |  |  |  |  | 1girl, pink_kimono, sidelocks, closed_mouth, hair_flower, blush, floral_print, looking_at_viewer, obi, upper_body, long_sleeves, solo_focus |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | open_mouth | portrait | solo | close-up | looking_at_viewer | :d | :o | closed_mouth | smile | shirt | indoors | collarbone | collared_shirt | white_shirt | upper_body | curtains | profile | from_side | long_sleeves | dutch_angle | holding | apron | whisk | ponytail | sidelocks | anime_coloring | grey_background | simple_background | purple_shirt | very_long_hair | yellow_apron | standing | white_skirt | braid | blue_ribbon | blunt_bangs | single_hair_bun | school_uniform | blazer | red_bowtie | blue_jacket | blurry_background | outdoors | hair_scrunchie | blue_scrunchie | pink_shirt | solo_focus | 2girls | brown_hair | pajamas | brown_skirt | plaid_skirt | 2boys | shoes | 1boy | head_out_of_frame | pants | pleated_skirt | cherry_blossoms | hair_bow | petals | tree | looking_back | flower | floating_hair | blanket | pillow | under_covers | on_bed | on_back | pink_kimono | hair_flower | floral_print | obi |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:-------------|:-----------|:-------|:-----------|:--------------------|:-----|:-----|:---------------|:--------|:--------|:----------|:-------------|:-----------------|:--------------|:-------------|:-----------|:----------|:------------|:---------------|:--------------|:----------|:--------|:--------|:-----------|:------------|:-----------------|:------------------|:--------------------|:---------------|:-----------------|:---------------|:-----------|:--------------|:--------|:--------------|:--------------|:------------------|:-----------------|:---------|:-------------|:--------------|:--------------------|:-----------|:-----------------|:-----------------|:-------------|:-------------|:---------|:-------------|:----------|:--------------|:--------------|:--------|:--------|:-------|:--------------------|:--------|:----------------|:------------------|:-----------|:---------|:-------|:---------------|:---------|:----------------|:----------|:---------|:---------------|:---------|:----------|:--------------|:--------------|:---------------|:------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | | X | X | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | | X | X | | X | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | X | X | | | | | | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | | X | X | | | | | X | | | | | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | | X | | X | | | X | X | | X | | X | X | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | X | X | | | | | X | X | X | | X | | | | | | | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | X | X | | X | | | | | | X | | X | X | | | | | | | X | | | | | X | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | X | | X | | X | | | | | | | | | X | | X | X | | | | | | | | | | X | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 24 |  |  |  |  |  | X | X | | | X | | X | | | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 6 |  |  |  |  |  | X | X | | | X | | | | | X | | | X | | | | X | | X | | | | | | | X | X | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 7 |  |  |  |  |  | X | X | | | | | X | | | X | X | | | X | X | X | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 5 |  |  |  |  |  | X | | | | | | | | | | X | | | | | X | | | | | X | | | | | | | | | | | X | | | | | | | | X | X | X | X | | | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 13 | 7 |  |  |  |  |  | X | | | | X | | X | | | X | X | | | | | X | X | | | X | X | | | | | | | | | | | X | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | |
| 14 | 5 |  |  |  |  |  | X | X | | | | | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | |
| 15 | 8 |  |  |  |  |  | X | X | | | | | X | | | X | | | | | | | X | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X |
|
projecte-aina/vilaquad | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- ca
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
pretty_name: VilaQuAD
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- extractive-qa
---
# Dataset Card for VilaQuAD
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://doi.org/10.5281/zenodo.4562337
- **Paper:** [Are Multilingual Models the Best Choice for Moderately Under-resourced Languages? A Comprehensive Assessment for Catalan](https://arxiv.org/abs/2107.07903)
- **Point of Contact:** langtech@bsc.es
### Dataset Summary
VilaQuAD is an extractive QA dataset for Catalan built from [VilaWeb](https://www.vilaweb.cat/) newswire text.
The dataset contains 2095 Catalan-language news articles along with 1 to 5 questions referring to each fragment (or context).
VilaQuAD articles are extracted from the daily [VilaWeb](https://www.vilaweb.cat/) and used under a [CC-BY-NC-ND](https://creativecommons.org/licenses/by-nc-nd/3.0/deed.ca) licence.
This dataset can be used to build extractive-QA systems and language models.
### Supported Tasks and Leaderboards
Extractive-QA, Language Model.
### Languages
The dataset is in Catalan (`ca-ES`).
## Dataset Structure
### Data Instances
```
{
'id': 'P_556_C_556_Q1',
'title': "El Macba posa en qüestió l'eufòria amnèsica dels anys vuitanta a l'estat espanyol",
'context': "El Macba ha obert una nova exposició, 'Gelatina dura. Històries escamotejades dels 80', dedicada a revisar el discurs hegemònic que es va instaurar en aquella dècada a l'estat espanyol, concretament des del començament de la transició, el 1977, fins a la fita de Barcelona 92. És una mirada en clau espanyola, però també centralista, perquè més enllà dels esdeveniments ocorreguts a Catalunya i els artistes que els van combatre, pràcticament només s'hi mostren fets polítics i culturals generats des de Madrid. No es parla del País Basc, per exemple. Però, dit això, l'exposició revisa aquesta dècada de la història recent tot qüestionant un triomfalisme homogeneïtzador, que ja se sap que va arrasar una gran quantitat de sectors crítics i radicals de l'àmbit social, polític i cultural. Com diu la comissària, Teresa Grandas, de l'equip del Macba: 'El relat oficial dels anys vuitanta a l'estat espanyol va prioritzar la necessitat per damunt de la raó i va consolidar una mirada que privilegiava el futur abans que l'anàlisi del passat recent, obviant qualsevol consideració crítica respecte de la filiació amb el poder franquista.",
'question': 'Com es diu la nova exposició que ha obert el Macba?',
'answers': [
{
'text': "'Gelatina dura. Històries escamotejades dels 80'",
'answer_start': 38
}
]
}
```
### Data Fields
The data fields follow the SQuAD v1 format of [Rajpurkar, Pranav et al. (2016)](http://arxiv.org/abs/1606.05250).
- `id` (str): Unique ID assigned to the question.
- `title` (str): Title of the VilaWeb article.
- `context` (str): VilaWeb section text.
- `question` (str): Question.
- `answers` (list): List of answers to the question, each containing:
  - `text` (str): Span of text answering the question.
  - `answer_start` (int): Starting character offset of the answer span within `context`.
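Answer offsets follow the SQuAD convention: `answer_start` is a 0-based character index into `context`, so the answer text can be recovered by slicing. A quick check against the instance above (context truncated for brevity):

```python
context = (
    "El Macba ha obert una nova exposició, "
    "'Gelatina dura. Històries escamotejades dels 80', "
    "dedicada a revisar el discurs hegemònic..."
)
answer = {"text": "'Gelatina dura. Històries escamotejades dels 80'",
          "answer_start": 38}

# Slice the context at the given character offset to recover the answer.
start = answer["answer_start"]
span = context[start:start + len(answer["text"])]
print(span == answer["text"])  # True
```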
### Data Splits
- train.json: 1295 contexts, 3882 questions
- dev.json: 400 contexts, 1200 questions
- test.json: 400 contexts, 1200 questions
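As a quick consistency check, the splits sum to the totals stated elsewhere in this card (2095 contexts and 6282 questions):

```python
# Per-split counts as listed above.
contexts = {"train": 1295, "dev": 400, "test": 400}
questions = {"train": 3882, "dev": 1200, "test": 1200}

print(sum(contexts.values()), sum(questions.values()))  # 2095 6282
```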
## Dataset Creation
### Curation Rationale
We created this dataset to contribute to the development of language models in Catalan, a low-resource language.
### Source Data
- [VilaWeb site](https://www.vilaweb.cat/)
#### Initial Data Collection and Normalization
The source data are scraped articles from the archives of the Catalan newspaper website [VilaWeb](https://www.vilaweb.cat).
From the online edition of the newspaper [VilaWeb](https://www.vilaweb.cat), 2095 articles were randomly selected. These headlines were also used to create a Textual Entailment dataset. For the extractive QA dataset, the creation of between 1 and 5 questions for each news context was commissioned, following an adaptation of the guidelines from SQuAD 1.0 ([Rajpurkar, Pranav et al. (2016)](http://arxiv.org/abs/1606.05250)). In total, 6282 pairs of a question and an extracted fragment containing the answer were created.
For compatibility with similar datasets in other languages, we followed existing curation guidelines as closely as possible. We also created [another QA dataset from Wikipedia](https://huggingface.co/datasets/projecte-aina/viquiquad) to ensure thematic and stylistic variety.
#### Who are the source language producers?
Professional journalists from the Catalan newspaper [VilaWeb](https://www.vilaweb.cat/).
### Annotations
#### Annotation process
We commissioned the creation of 1 to 5 questions for each context, following an adaptation of the guidelines from SQuAD 1.0 ([Rajpurkar, Pranav et al. (2016)](http://arxiv.org/abs/1606.05250)).
#### Who are the annotators?
Annotation was commissioned to a specialized company that hired a team of native language speakers.
### Personal and Sensitive Information
No personal or sensitive information included.
## Considerations for Using the Data
### Social Impact of Dataset
We hope this dataset contributes to the development of language models in Catalan, a low-resource language.
### Discussion of Biases
[N/A]
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
Text Mining Unit (TeMU) at the Barcelona Supercomputing Center (bsc-temu@bsc.es)
This work was funded by the [Departament de la Vicepresidència i de Polítiques Digitals i Territori de la Generalitat de Catalunya](https://politiquesdigitals.gencat.cat/en/inici/index.html) within the framework of [Projecte AINA](https://politiquesdigitals.gencat.cat/ca/economia/catalonia-ai/aina/).
### Licensing Information
This work is licensed under a <a rel="license" href="https://creativecommons.org/licenses/by-sa/4.0/">Attribution-ShareAlike 4.0 International License</a>.
### Citation Information
```
@inproceedings{armengol-estape-etal-2021-multilingual,
title = "Are Multilingual Models the Best Choice for Moderately Under-resourced Languages? {A} Comprehensive Assessment for {C}atalan",
author = "Armengol-Estap{\'e}, Jordi and
Carrino, Casimiro Pio and
Rodriguez-Penagos, Carlos and
de Gibert Bonet, Ona and
Armentano-Oller, Carme and
Gonzalez-Agirre, Aitor and
Melero, Maite and
Villegas, Marta",
booktitle = "Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.findings-acl.437",
doi = "10.18653/v1/2021.findings-acl.437",
pages = "4933--4946",
}
```
[DOI](https://doi.org/10.5281/zenodo.4562337)
### Contributions
[N/A] |
AdapterOcean/Open_Platypus_standardized_cluster_8_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3469650
num_examples: 3068
download_size: 1775180
dataset_size: 3469650
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Open_Platypus_standardized_cluster_8_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pravsels/manim_3b1b_code | ---
dataset_info:
features:
- name: file_path
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 887937
num_examples: 123
download_size: 336530
dataset_size: 887937
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AsphyXIA/Baarat-Kan-Summarization | ---
dataset_info:
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 97489736
num_examples: 140890
download_size: 40315874
dataset_size: 97489736
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|