| datasetId | card |
|---|---|
INSAIT-Institute/arc-easy-bgeval | ---
license: cc-by-sa-4.0
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: train
num_bytes: 1041020
num_examples: 2251
- name: test
num_bytes: 1106644
num_examples: 2376
- name: validation
num_bytes: 264848
num_examples: 570
download_size: 1094042
dataset_size: 2412512
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
rsilveira79/soprano_dpo_pairs | ---
license: apache-2.0
dataset_info:
features:
- name: question
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 1026797
num_examples: 500
download_size: 638927
dataset_size: 1026797
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kunishou/hh-rlhf-49k-ja-single-turn | ---
license: mit
---
This dataset was created by automatically translating part of "Anthropic/hh-rlhf" into Japanese and selecting only single-turn conversations.
You can use this dataset for RLHF and DPO training.
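Since the card targets RLHF/DPO use, here is a minimal sketch of turning one hh-rlhf-style record into a (prompt, chosen, rejected) triple for DPO training. The field layout (full "Human:/Assistant:" transcripts in the `chosen`/`rejected` fields) and the splitting logic are assumptions based on the upstream Anthropic/hh-rlhf format; this card does not document the schema.

```python
# Sketch: convert one single-turn hh-rlhf-style record into a DPO triple.
# Assumes the upstream "\n\nHuman: ...\n\nAssistant: ..." transcript layout.

def to_dpo_pair(example: dict) -> dict:
    """Split single-turn chosen/rejected transcripts into prompt + responses."""
    def split(transcript: str) -> tuple[str, str]:
        human, _, assistant = transcript.partition("\n\nAssistant:")
        return human.removeprefix("\n\nHuman:").strip(), assistant.strip()

    prompt, chosen = split(example["chosen"])
    _, rejected = split(example["rejected"])
    return {"prompt": prompt, "chosen": chosen, "rejected": rejected}

# Hypothetical single-turn record in the upstream transcript format:
sample = {
    "chosen": "\n\nHuman: What is the capital of Japan?\n\nAssistant: Tokyo.",
    "rejected": "\n\nHuman: What is the capital of Japan?\n\nAssistant: I don't know.",
}
pair = to_dpo_pair(sample)
```

The resulting `{"prompt", "chosen", "rejected"}` dicts are the shape most DPO trainers (e.g. TRL's `DPOTrainer`) expect.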
hh-rlhf repository: https://github.com/anthropics/hh-rlhf
Anthropic/hh-rlhf: https://huggingface.co/datasets/Anthropic/hh-rlhf |
Seanxh/twitter_dataset_1713208066 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 152562
num_examples: 357
download_size: 56463
dataset_size: 152562
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
carnival13/xlmr_int_hard_curr_trn_ep2_lrg | ---
dataset_info:
features:
- name: domain_label
dtype: int64
- name: pass_label
dtype: int64
- name: input
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 285070021
num_examples: 226100
download_size: 80645458
dataset_size: 285070021
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "xlmr_int_hard_curr_trn_ep2_lrg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HumanCompatibleAI/ppo-seals-Humanoid-v1 | ---
dataset_info:
features:
- name: obs
sequence:
sequence: float64
- name: acts
sequence:
sequence: float32
- name: infos
sequence: string
- name: terminal
dtype: bool
- name: rews
sequence: float32
splits:
- name: train
num_bytes: 447344692
num_examples: 104
download_size: 244295905
dataset_size: 447344692
---
# Dataset Card for "ppo-seals-Humanoid-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alexthomas4/highsub-classification | ---
dataset_info:
features:
- name: image
dtype: image
- name: image_url
dtype: string
- name: id
dtype: string
- name: label
dtype:
class_label:
names:
'0': rarity:common
'1': rarity:uncommon
'2': rarity:rare
'3': rarity:super_rare
'4': rarity:ultra_rare
splits:
- name: train
num_bytes: 11681495727.622
num_examples: 5994
download_size: 9233260171
dataset_size: 11681495727.622
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SachinPatel248/mqnli | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: question
dtype: string
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': not_entailment
- name: translated_question_lang
dtype: string
- name: translated_sentence_lang
dtype: string
- name: translated_question
dtype: string
- name: translated_sentence
dtype: string
splits:
- name: train
num_bytes: 54987341
num_examples: 103059
download_size: 39711768
dataset_size: 54987341
task_categories:
- text-classification
language:
- en
- de
- es
- ar
- zh
- hi
- pt
- ru
- ja
- fr
- ur
- tr
- ko
- pl
- it
- sv
pretty_name: Multilingual QNLI (from GLUE)
size_categories:
- 10K<n<100K
--- |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/26f0dd27 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1331
dataset_size: 182
---
# Dataset Card for "26f0dd27"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dhruv107/docs_pro_max_all_combined_image_Mar_5 | ---
license: apache-2.0
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 1316524254.0
num_examples: 884
- name: validation
num_bytes: 243796725.0
num_examples: 166
- name: test
num_bytes: 82502179.0
num_examples: 56
download_size: 1639323383
dataset_size: 1642823158.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Edsodre/xuxa | ---
license: openrail
---
|
AlekseyKorshuk/DotCHA-100k-preprocessed | ---
dataset_info:
features:
- name: letter
sequence: int64
- name: buckets
sequence:
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1685564572
num_examples: 100000
download_size: 1471149713
dataset_size: 1685564572
---
# Dataset Card for "DotCHA-100k-preprocessed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
oliverjthomas2000/finetune-test | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 8756
num_examples: 199
download_size: 1363
dataset_size: 8756
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-FP16 | ---
pretty_name: Evaluation run of The-Face-Of-Goonery/Huginn-13b-FP16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [The-Face-Of-Goonery/Huginn-13b-FP16](https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-FP16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-FP16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T23:23:06.857366](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-FP16/blob/main/results_2023-10-17T23-23-06.857366.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.33609479865771813,\n\
\ \"em_stderr\": 0.004837529011799984,\n \"f1\": 0.41438129194631024,\n\
\ \"f1_stderr\": 0.004663694796707255,\n \"acc\": 0.39019449213217305,\n\
\ \"acc_stderr\": 0.008985955021249931\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.33609479865771813,\n \"em_stderr\": 0.004837529011799984,\n\
\ \"f1\": 0.41438129194631024,\n \"f1_stderr\": 0.004663694796707255\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.043214556482183475,\n \
\ \"acc_stderr\": 0.005600987515237852\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7371744277821626,\n \"acc_stderr\": 0.01237092252726201\n\
\ }\n}\n```"
repo_url: https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-FP16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|arc:challenge|25_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T23_23_06.857366
path:
- '**/details_harness|drop|3_2023-10-17T23-23-06.857366.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T23-23-06.857366.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T23_23_06.857366
path:
- '**/details_harness|gsm8k|5_2023-10-17T23-23-06.857366.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T23-23-06.857366.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hellaswag|10_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T13:30:49.317288.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T13:30:49.317288.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T13:30:49.317288.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T23_23_06.857366
path:
- '**/details_harness|winogrande|5_2023-10-17T23-23-06.857366.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T23-23-06.857366.parquet'
- config_name: results
data_files:
- split: 2023_08_09T13_30_49.317288
path:
- results_2023-08-09T13:30:49.317288.parquet
- split: 2023_10_17T23_23_06.857366
path:
- results_2023-10-17T23-23-06.857366.parquet
- split: latest
path:
- results_2023-10-17T23-23-06.857366.parquet
---
# Dataset Card for Evaluation run of The-Face-Of-Goonery/Huginn-13b-FP16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-FP16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [The-Face-Of-Goonery/Huginn-13b-FP16](https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-FP16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-FP16",
"harness_winogrande_5",
	split="latest")
```
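As a side note, the timestamped split names in the configs above are just the run timestamps with their separators normalized: the parquet filenames may use `-` or `:` in the time part (compare the 2023-08 and 2023-10 runs above), while split names always use `_`. A minimal sketch of that mapping, inferred from this card's YAML header rather than from any official API:

```python
def filename_ts_to_split(ts: str) -> str:
    """Normalize a run timestamp to its split name.

    Inferred from the YAML configs above: '-' and ':' separators
    both become '_', while 'T' and the fractional '.' are kept.
    """
    return ts.replace("-", "_").replace(":", "_")

print(filename_ts_to_split("2023-10-17T23-23-06.857366"))
# → 2023_10_17T23_23_06.857366
print(filename_ts_to_split("2023-08-09T13:30:49.317288"))
# → 2023_08_09T13_30_49.317288
```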
## Latest results
These are the [latest results from run 2023-10-17T23:23:06.857366](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-FP16/blob/main/results_2023-10-17T23-23-06.857366.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.33609479865771813,
"em_stderr": 0.004837529011799984,
"f1": 0.41438129194631024,
"f1_stderr": 0.004663694796707255,
"acc": 0.39019449213217305,
"acc_stderr": 0.008985955021249931
},
"harness|drop|3": {
"em": 0.33609479865771813,
"em_stderr": 0.004837529011799984,
"f1": 0.41438129194631024,
"f1_stderr": 0.004663694796707255
},
"harness|gsm8k|5": {
"acc": 0.043214556482183475,
"acc_stderr": 0.005600987515237852
},
"harness|winogrande|5": {
"acc": 0.7371744277821626,
"acc_stderr": 0.01237092252726201
}
}
```
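For a quick sanity check, the `all` block above appears to be a plain unweighted mean of the per-task values; e.g. for `acc` (a sketch based only on the numbers in this JSON, not on the leaderboard's source code):

```python
# "all" acc looks like the unweighted mean of the per-task acc values above.
gsm8k_acc = 0.043214556482183475      # harness|gsm8k|5
winogrande_acc = 0.7371744277821626   # harness|winogrande|5

all_acc = (gsm8k_acc + winogrande_acc) / 2
# Agrees with the reported "all" acc of 0.39019449213217305 up to float rounding.
assert abs(all_acc - 0.39019449213217305) < 1e-12
print(all_acc)
```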
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yzhuang/autotree_snnxor_n30_l2_2 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 402200000
num_examples: 10000
- name: validation
num_bytes: 402200000
num_examples: 10000
- name: test
num_bytes: 402200000
num_examples: 10000
download_size: 351933707
dataset_size: 1206600000
---
# Dataset Card for "autotree_snnxor_n30_l2_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FanChen0116/few7_19100_chat_time8x | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-time
'2': B-date
'3': B-last_name
'4': B-people
'5': I-date
'6': I-people
'7': I-last_name
'8': I-first_name
'9': B-first_name
'10': B-time
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 102830
num_examples: 570
- name: validation
num_bytes: 998
num_examples: 6
- name: test
num_bytes: 646729
num_examples: 3731
download_size: 0
dataset_size: 750557
---
# Dataset Card for "few7_19100_chat_time8x"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DZN111/cucu | ---
license: openrail
---
|
open-llm-leaderboard/details_shadowml__Marcoro14-7B-slerp | ---
pretty_name: Evaluation run of mlabonne/Marcoro14-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mlabonne/Marcoro14-7B-slerp](https://huggingface.co/mlabonne/Marcoro14-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__Marcoro14-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2023-12-30T17:07:52.198441](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__Marcoro14-7B-slerp/blob/main/results_2023-12-30T17-07-52.198441.json) (note\
  \ that there might be results for other tasks in the repo if successive evals didn't\
  \ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6557670960374431,\n\
\ \"acc_stderr\": 0.031998348451839013,\n \"acc_norm\": 0.6555797586821419,\n\
\ \"acc_norm_stderr\": 0.032660366522478446,\n \"mc1\": 0.4724602203182375,\n\
\ \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6354053076486196,\n\
\ \"mc2_stderr\": 0.015212905778062237\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.013688147309729125,\n\
\ \"acc_norm\": 0.6979522184300341,\n \"acc_norm_stderr\": 0.01341751914471641\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6919936267675761,\n\
\ \"acc_stderr\": 0.004607256752931883,\n \"acc_norm\": 0.8713403704441346,\n\
\ \"acc_norm_stderr\": 0.003341385493187586\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474887,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474887\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"\
acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857413,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857413\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8532110091743119,\n \"acc_stderr\": 0.01517314184512625,\n \"\
acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.01517314184512625\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n\
\ \"acc_stderr\": 0.013223928616741622,\n \"acc_norm\": 0.8365261813537676,\n\
\ \"acc_norm_stderr\": 0.013223928616741622\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992005,\n\
\ \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992005\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.423463687150838,\n\
\ \"acc_stderr\": 0.016525425898773493,\n \"acc_norm\": 0.423463687150838,\n\
\ \"acc_norm_stderr\": 0.016525425898773493\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083135,\n\
\ \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083135\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"\
acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488689,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488689\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4724602203182375,\n\
\ \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6354053076486196,\n\
\ \"mc2_stderr\": 0.015212905778062237\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8161010260457774,\n \"acc_stderr\": 0.01088791601330589\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7088703563305534,\n \
\ \"acc_stderr\": 0.012513215297888463\n }\n}\n```"
repo_url: https://huggingface.co/mlabonne/Marcoro14-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|arc:challenge|25_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|gsm8k|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hellaswag|10_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T17-07-52.198441.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T17-07-52.198441.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- '**/details_harness|winogrande|5_2023-12-30T17-07-52.198441.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T17-07-52.198441.parquet'
- config_name: results
data_files:
- split: 2023_12_30T17_07_52.198441
path:
- results_2023-12-30T17-07-52.198441.parquet
- split: latest
path:
- results_2023-12-30T17-07-52.198441.parquet
---
# Dataset Card for Evaluation run of mlabonne/Marcoro14-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/Marcoro14-7B-slerp](https://huggingface.co/mlabonne/Marcoro14-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__Marcoro14-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-30T17:07:52.198441](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__Marcoro14-7B-slerp/blob/main/results_2023-12-30T17-07-52.198441.json) (note that there might be results for other tasks in the repositories if successive evals didn't cover the same tasks. You can find each task in the "results" and the "latest" splits for each eval):
```python
{
"all": {
"acc": 0.6557670960374431,
"acc_stderr": 0.031998348451839013,
"acc_norm": 0.6555797586821419,
"acc_norm_stderr": 0.032660366522478446,
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6354053076486196,
"mc2_stderr": 0.015212905778062237
},
"harness|arc:challenge|25": {
"acc": 0.6749146757679181,
"acc_stderr": 0.013688147309729125,
"acc_norm": 0.6979522184300341,
"acc_norm_stderr": 0.01341751914471641
},
"harness|hellaswag|10": {
"acc": 0.6919936267675761,
"acc_stderr": 0.004607256752931883,
"acc_norm": 0.8713403704441346,
"acc_norm_stderr": 0.003341385493187586
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474887,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474887
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857413,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857413
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.01517314184512625,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.01517314184512625
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741622,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741622
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992005,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992005
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.423463687150838,
"acc_stderr": 0.016525425898773493,
"acc_norm": 0.423463687150838,
"acc_norm_stderr": 0.016525425898773493
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083135,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083135
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.01879808628488689,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.01879808628488689
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6354053076486196,
"mc2_stderr": 0.015212905778062237
},
"harness|winogrande|5": {
"acc": 0.8161010260457774,
"acc_stderr": 0.01088791601330589
},
"harness|gsm8k|5": {
"acc": 0.7088703563305534,
"acc_stderr": 0.012513215297888463
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
huanngzh/anime_face_control_60k | ---
dataset_info:
features:
- name: item_id
dtype: string
- name: prompt
dtype: string
- name: blip_caption
dtype: string
- name: landmarks
sequence:
sequence: float64
- name: source
dtype: image
- name: target
dtype: image
- name: visual
dtype: image
- name: origin_path
dtype: string
- name: source_path
dtype: string
- name: target_path
dtype: string
- name: visual_path
dtype: string
splits:
- name: train
num_bytes: 5359477272.0
num_examples: 60000
download_size: 0
dataset_size: 5359477272.0
---
# Dataset Card for "acgn_face_control_60k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mekaneeky/Synthetic_Acholi_VITS_22.5k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: eng
dtype: string
- name: lug
dtype: string
- name: ach
dtype: string
- name: teo
dtype: string
- name: lgg
dtype: string
- name: nyn
dtype: string
- name: ID
dtype: string
- name: ach_tts
sequence:
sequence: float32
splits:
- name: train
num_bytes: 17816721728
num_examples: 23947
- name: dev
num_bytes: 361145932
num_examples: 500
- name: test
num_bytes: 375082248
num_examples: 500
download_size: 18567936006
dataset_size: 18552949908
---
# Dataset Card for "Synthetic_Acholi_VITS_22.5k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
flaviolima/coringaa | ---
license: openrail
---
|
senhorsapo/subaru | ---
license: openrail
---
|
yangyz1230/H4 | ---
dataset_info:
features:
- name: name
dtype: string
- name: sequence
dtype: string
- name: chrom
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: strand
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 319081
num_examples: 566
- name: test
num_bytes: 39314
num_examples: 70
download_size: 178521
dataset_size: 358395
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Francesco/stomata-cells | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': stomata-cells
'1': close
'2': open
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: stomata-cells
tags:
- rf100
---
# Dataset Card for stomata-cells
**The original COCO dataset is stored at `dataset.tar.gz`**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/stomata-cells
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
stomata-cells
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 640,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
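As described above, `bbox` uses the COCO `[x_min, y_min, width, height]` convention. A tiny helper (hypothetical, not shipped with the dataset) converts such a box to the `[x_min, y_min, x_max, y_max]` corner format that many plotting and augmentation libraries expect:

```python
def coco_to_corners(bbox):
    # COCO boxes are [x_min, y_min, width, height];
    # corner format is [x_min, y_min, x_max, y_max].
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# First box from the data instance above:
print(coco_to_corners([302.0, 109.0, 73.0, 52.0]))  # [302.0, 109.0, 375.0, 161.0]
```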
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/stomata-cells
### Citation Information
```
@misc{ stomata-cells,
title = { stomata cells Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/stomata-cells } },
url = { https://universe.roboflow.com/object-detection/stomata-cells },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
alirahebi/no_robots | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: category
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 28805395
num_examples: 9500
- name: test
num_bytes: 1545168
num_examples: 500
download_size: 18891461
dataset_size: 30350563
---
# Dataset Card for "no_robots"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chompk/tydiqa-goldp-th | ---
pretty_name: TyDiQA-GoldP-Th
language:
- th
task_categories:
- question-answering
task_ids:
- extractive-qa
configs:
- config_name: default
data_files:
- split: train
path: tydiqa.goldp.th.train.json
- split: dev
path: tydiqa.goldp.th.dev.json
---
# TyDiQA-GoldP-Th
This dataset contains the Thai TyDiQA data that was removed from the official GoldP release, obtained from [Khalidalt's TyDiQA Dataset](https://huggingface.co/datasets/khalidalt/tydiqa-goldp).
This version applies the following additional preprocessing to the dataset:
1. Convert byte-level indices into character-level indices
2. Fix any mismatched text between answer spans and the actual context text
3. Re-split the train/development sets so that no context passage leaks between splits
4. Deduplicate questions asked about the same context passage
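Step 1 above can be sketched as follows — a minimal illustration, assuming each byte offset falls on a valid UTF-8 character boundary (which holds for TyDiQA's annotated spans):

```python
def byte_to_char_index(text: str, byte_index: int) -> int:
    # Decode the UTF-8 prefix of `byte_index` bytes and count its characters.
    return len(text.encode("utf-8")[:byte_index].decode("utf-8"))

# Thai characters occupy 3 bytes each in UTF-8,
# so byte offset 6 maps to character offset 2:
print(byte_to_char_index("ภาษาไทย", 6))  # 2
```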
## Dataset Format
The dataset is formatted to be compatible with the [XTREME benchmark](https://github.com/google-research/xtreme) format. The data follows this pattern:
```json
{
  "version": "TyDiQA-GoldP-1.1-for-SQuAD-1.1",
  "data": [
    {
      "paragraphs": [
        {
          "context": [PASSAGE CONTEXT HERE],
          "qas": [
            {
              "answers": [
                {
                  "answer_start": [CONTEXT START CHAR INDEX OF ANSWER],
                  "text": [TEXT SPAN FROM CONTEXT]
                }
              ],
              "question": [QUESTION],
              "id": [ID]
            }
          ]
        }
      ]
    },
    ...
  ]
}
```
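A quick sanity check for files in this format — verifying that every `answer_start` really indexes its answer text inside the context — might look like the sketch below (illustrative only, not part of the dataset's tooling):

```python
def check_answer_spans(dataset):
    # Assert each answer span equals the context slice starting at answer_start.
    for article in dataset["data"]:
        for paragraph in article["paragraphs"]:
            context = paragraph["context"]
            for qa in paragraph["qas"]:
                for answer in qa["answers"]:
                    start = answer["answer_start"]
                    span = context[start:start + len(answer["text"])]
                    assert span == answer["text"], (qa["id"], span, answer["text"])

# Toy example in the same shape as the file above:
toy = {"data": [{"paragraphs": [{
    "context": "กรุงเทพเป็นเมืองหลวงของไทย",
    "qas": [{"id": "q1", "question": "เมืองหลวงของไทยคือ?",
             "answers": [{"answer_start": 0, "text": "กรุงเทพ"}]}]}]}]}
check_answer_spans(toy)  # passes silently
```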
## Author
Chompakorn Chaksangchaichot |
distilled-from-one-sec-cv12/chunk_107 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 895508340
num_examples: 174495
download_size: 913736256
dataset_size: 895508340
---
# Dataset Card for "chunk_107"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ali-C137/ArabicGuanaco-X-DSD-Dataset | ---
dataset_info:
features:
- name: Text
dtype: string
splits:
- name: train
num_bytes: 497099787
num_examples: 15988
download_size: 251298896
dataset_size: 497099787
---
# Dataset Card for "ArabicGuanaco-X-DSD-4PolyLM-Dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zpn/bbbp | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
license:
- mit
multilinguality:
- monolingual
pretty_name: bbbp
size_categories:
- 1K<n<10K
source_datasets: []
tags:
- bio
- bio-chem
- molnet
- molecule-net
- biophysics
task_categories:
- other
task_ids: []
---
# Dataset Card for bbbp
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage: https://moleculenet.org/**
- **Repository: https://github.com/deepchem/deepchem/tree/master**
- **Paper: https://arxiv.org/abs/1703.00564**
### Dataset Summary
`bbbp` is a dataset included in [MoleculeNet](https://moleculenet.org/). This dataset has binary labels of blood-brain barrier penetration (permeability).
## Dataset Structure
### Data Fields
Each split contains
* `smiles`: the [SMILES](https://en.wikipedia.org/wiki/Simplified_molecular-input_line-entry_system) representation of a molecule
* `selfies`: the [SELFIES](https://github.com/aspuru-guzik-group/selfies) representation of a molecule
* `target`: blood-brain barrier penetration (permeability)
### Data Splits
The dataset is split into an 80/10/10 train/valid/test split using a scaffold split.
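As a rough illustration of the idea (not the exact splitter used for this dataset — MoleculeNet uses Bemis–Murcko scaffolds computed with RDKit, e.g. via DeepChem's `ScaffoldSplitter`), a greedy scaffold split keeps every scaffold group inside a single split. `scaffold_fn` below is a hypothetical stand-in for a real scaffold function:

```python
from collections import defaultdict

def scaffold_split(items, scaffold_fn, frac_train=0.8, frac_valid=0.1):
    # Group items so molecules sharing a scaffold never cross split boundaries.
    groups = defaultdict(list)
    for item in items:
        groups[scaffold_fn(item)].append(item)
    # Assign the largest scaffold groups first, filling train, then valid, then test.
    ordered = sorted(groups.values(), key=len, reverse=True)
    n = len(items)
    train, valid, test = [], [], []
    for group in ordered:
        if len(train) + len(group) <= frac_train * n:
            train.extend(group)
        elif len(train) + len(valid) + len(group) <= (frac_train + frac_valid) * n:
            valid.extend(group)
        else:
            test.extend(group)
    return train, valid, test
```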
### Source Data
#### Initial Data Collection and Normalization
Data was originally generated by the Pande Group at Stanford.
### Licensing Information
This dataset was originally released under an MIT license
### Citation Information
```
@misc{https://doi.org/10.48550/arxiv.1703.00564,
doi = {10.48550/ARXIV.1703.00564},
url = {https://arxiv.org/abs/1703.00564},
author = {Wu, Zhenqin and Ramsundar, Bharath and Feinberg, Evan N. and Gomes, Joseph and Geniesse, Caleb and Pappu, Aneesh S. and Leswing, Karl and Pande, Vijay},
keywords = {Machine Learning (cs.LG), Chemical Physics (physics.chem-ph), Machine Learning (stat.ML), FOS: Computer and information sciences, FOS: Computer and information sciences, FOS: Physical sciences, FOS: Physical sciences},
title = {MoleculeNet: A Benchmark for Molecular Machine Learning},
publisher = {arXiv},
year = {2017},
copyright = {arXiv.org perpetual, non-exclusive license}
}
```
### Contributions
Thanks to [@zanussbaum](https://github.com/zanussbaum) for adding this dataset.
|
open-llm-leaderboard/details_stabilityai__stablelm-3b-4e1t | ---
pretty_name: Evaluation run of stabilityai/stablelm-3b-4e1t
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [stabilityai/stablelm-3b-4e1t](https://huggingface.co/stabilityai/stablelm-3b-4e1t)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_stabilityai__stablelm-3b-4e1t_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-08T16:27:49.205374](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-3b-4e1t_public/blob/main/results_2023-11-08T16-27-49.205374.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.00041913301788267703,\n \"f1\": 0.053592701342281994,\n\
\ \"f1_stderr\": 0.001271488426848693,\n \"acc\": 0.3726382606707983,\n\
\ \"acc_stderr\": 0.008837083686710946\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788267703,\n\
\ \"f1\": 0.053592701342281994,\n \"f1_stderr\": 0.001271488426848693\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03335860500379075,\n \
\ \"acc_stderr\": 0.004946282649173774\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7119179163378059,\n \"acc_stderr\": 0.012727884724248116\n\
\ }\n}\n```"
repo_url: https://huggingface.co/stabilityai/stablelm-3b-4e1t
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_11_08T16_27_49.205374
path:
- '**/details_harness|drop|3_2023-11-08T16-27-49.205374.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-08T16-27-49.205374.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_08T16_27_49.205374
path:
- '**/details_harness|gsm8k|5_2023-11-08T16-27-49.205374.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-08T16-27-49.205374.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_08T16_27_49.205374
path:
- '**/details_harness|winogrande|5_2023-11-08T16-27-49.205374.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-08T16-27-49.205374.parquet'
- config_name: results
data_files:
- split: 2023_11_08T16_27_49.205374
path:
- results_2023-11-08T16-27-49.205374.parquet
- split: latest
path:
- results_2023-11-08T16-27-49.205374.parquet
---
# Dataset Card for Evaluation run of stabilityai/stablelm-3b-4e1t
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/stabilityai/stablelm-3b-4e1t
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [stabilityai/stablelm-3b-4e1t](https://huggingface.co/stabilityai/stablelm-3b-4e1t) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_stabilityai__stablelm-3b-4e1t_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-08T16:27:49.205374](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-3b-4e1t_public/blob/main/results_2023-11-08T16-27-49.205374.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788267703,
"f1": 0.053592701342281994,
"f1_stderr": 0.001271488426848693,
"acc": 0.3726382606707983,
"acc_stderr": 0.008837083686710946
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788267703,
"f1": 0.053592701342281994,
"f1_stderr": 0.001271488426848693
},
"harness|gsm8k|5": {
"acc": 0.03335860500379075,
"acc_stderr": 0.004946282649173774
},
"harness|winogrande|5": {
"acc": 0.7119179163378059,
"acc_stderr": 0.012727884724248116
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mstz/spect | ---
language:
- en
tags:
- spect
- tabular_classification
- binary_classification
- UCI
pretty_name: SPECT
size_categories:
- n<1K
task_categories:
- tabular-classification
configs:
- spect
- spectf
license: cc
---
# SPECT Heart
The [SPECT Heart dataset](https://archive.ics.uci.edu/ml/datasets/SPECT+Heart) from the [UCI ML repository](https://archive.ics.uci.edu/ml/datasets).
# Configurations and tasks
| **Configuration** | **Task**                  | **Description**                                          |
|-------------------|---------------------------|----------------------------------------------------------|
| spect             | Binary classification     | Is the cardiac SPECT image abnormal? (binary features)   |
| spectf            | Binary classification     | Is the cardiac SPECT image abnormal? (continuous features)|
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/spect", "spect")["train"]
``` |
LanguageBind/Open-Sora-Plan-v1.0.0 | ---
license: mit
---
# Open-Sora-Dataset
Welcome to the Open-Sora-DataSet project! As part of the [Open-Sora-Plan](https://github.com/PKU-YuanGroup/Open-Sora-Plan) project, this repository details the collection and processing of the datasets. We started this project to build a high-quality video dataset for the open-source world. 💪
We warmly welcome you to join us! Let's contribute to the open-source world together! Thank you for your support and contribution.
**If you like our project, please give us a star ⭐ on [GitHub](https://github.com/PKU-YuanGroup/Open-Sora-Plan) for the latest updates.**
## Data Construction for Open-Sora-Plan v1.0.0
### Data distribution
We crawled 40258 videos from open-source websites under the CC0 license. All videos are of high quality without watermarks, and about 60% of them are landscape data. The total duration is about **274h 05m 13s**. The main sources of data are divided into three parts:
1. [mixkit](https://mixkit.co/): The total number of videos we collected is **1234**, the total duration is about **6h 19m 32s**, and the total number of frames is **570815**. The distribution histograms of video resolution and aspect ratio are as follows (categories accounting for less than 1% are omitted):
<img src="assets/v1.0.0_mixkit_resolution_plot.png" width="400" /> <img src="assets/v1.0.0_mixkit_aspect_ratio_plot.png" width="400" />
2. [pexels](https://www.pexels.com/zh-cn/): The total number of videos we collected is **7408**, the total duration is about **48h 49m 24s**, and the total number of frames is **5038641**. The distribution histograms of video resolution and aspect ratio are as follows (categories accounting for less than 1% are omitted):
<img src="assets/v1.0.0_pexels_resolution_plot.png" height="300" /> <img src="assets/v1.0.0_pexels_aspect_ratio_plot.png" height="300" />
3. [pixabay](https://pixabay.com/): The total number of videos we collected is **31616**, the total duration is about **218h 56m 17s**, and the total number of frames is **23508970**. The distribution histograms of video resolution and aspect ratio are as follows (categories accounting for less than 1% are omitted):
<img src="assets/v1.0.0_pixabay_resolution_plot.png" height="300" /> <img src="assets/v1.0.0_pixabay_aspect_ratio_plot.png" height="300" />
### Dense captions
It is challenging to directly crawl a large quantity of high-quality dense captions from the internet, so we utilize a mature image-captioning model to generate them. We conducted ablation experiments on two multimodal large models: [ShareGPT4V-Captioner-7B](https://github.com/InternLM/InternLM-XComposer/blob/main/projects/ShareGPT4V/README.md) and [LLaVA-1.6-34B](https://github.com/haotian-liu/LLaVA). The former is specifically designed for caption generation, while the latter is a general-purpose multimodal large model. Our ablations found them comparable in performance, but their inference speed on the A800 GPU differs significantly: 40s/it at a batch size of 12 for ShareGPT4V-Captioner-7B vs. 15s/it at a batch size of 1 for LLaVA-1.6-34B. We open-source all annotations [here](https://huggingface.co/datasets/LanguageBind/Open-Sora-Plan-v1.0.0). Some statistics are shown below; we set the maximum model length to 300, which covers almost 99% of the samples.
| Name | Avg length | Max | Std |
|---|---|---|---|
| ShareGPT4V-Captioner-7B | 170.08 | 467 | 53.69 |
| LLaVA-1.6-34B | 141.76 | 472 | 48.52 |
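Caption-length statistics like those in the table above can be reproduced with a small stdlib-only sketch. The captions below are made up for illustration; the real ones come from the released annotation files, and length is measured in whitespace-separated tokens:

```python
import statistics

# Hypothetical captions standing in for the released dense annotations.
captions = [
    "A slow pan across a snow-covered mountain range at sunrise.",
    "Close-up of raindrops hitting a window while traffic passes outside.",
    "A drone circling a lighthouse.",
]

# Token count per caption, using whitespace splitting.
lengths = [len(c.split()) for c in captions]

avg = statistics.mean(lengths)
max_len = max(lengths)
std = statistics.stdev(lengths)  # sample standard deviation

print(f"Avg length: {avg:.2f}, Max: {max_len}, Std: {std:.2f}")

# Share of captions that fit within a maximum model length of 300 tokens.
coverage = sum(l <= 300 for l in lengths) / len(lengths)
print(f"Coverage at 300 tokens: {coverage:.0%}")
```

Running the same computation over the full annotation set yields the averages, maxima, and standard deviations reported in the table.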
## Video split
### Video with transitions
Use [panda-70m](https://github.com/snap-research/Panda-70M/tree/main/splitting) to split videos that contain transitions.
### Video without transitions
1. Clone this repository and navigate to Open-Sora-Plan folder
```
git clone https://github.com/PKU-YuanGroup/Open-Sora-Plan
cd Open-Sora-Plan
```
2. Install the required packages
```
conda create -n opensora python=3.8 -y
conda activate opensora
pip install -e .
```
3. Split video script
```
git clone https://github.com/PKU-YuanGroup/Open-Sora-Dataset
python split/no_transition.py --video_json_file /path/to/your_video /path/to/save
```
If you want to know more, check out [Requirements and Installation](https://github.com/PKU-YuanGroup/Open-Sora-Plan?tab=readme-ov-file#%EF%B8%8F-requirements-and-installation)
## Acknowledgement 👍
Qingdao Weiyi Network Technology Co., Ltd.: Thank you very much for providing us with valuable data
|
AdapterOcean/med_alpaca_standardized_cluster_98 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 71594107
num_examples: 7118
download_size: 21529040
dataset_size: 71594107
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_98"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arieg/bw_spec_cls_4_06_s_200 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '574'
'1': '615'
'2': '620'
'3': '621'
splits:
- name: train
num_bytes: 42703982.0
num_examples: 800
- name: test
num_bytes: 1070833.0
num_examples: 20
download_size: 38425177
dataset_size: 43774815.0
---
# Dataset Card for "bw_spec_cls_4_06_s_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_num_v5_full_recite_full_passage_no_permute_rerun | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 8714584.788690874
num_examples: 4778
- name: validation
num_bytes: 580808
num_examples: 300
download_size: 1587540
dataset_size: 9295392.788690874
---
# Dataset Card for "squad_qa_num_v5_full_recite_full_passage_no_permute_rerun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carloswylker/AudiosBatista | ---
license: openrail
---
|
TesterSet/creepy | ---
license: openrail
---
|
Nexdata/British_English_Speech_Data_by_Mobile_Phone_Guiding | ---
---
# Dataset Card for Nexdata/British_English_Speech_Data_by_Mobile_Phone_Guiding
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/81?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains speech data from 349 English speakers, all of whom are British locals. The recording environment is quiet. The recorded content covers many domains such as in-car commands, smart home, and voice assistants, with about 50 sentences per speaker. The valid data amounts to 9.5 hours. All texts are manually transcribed with high accuracy.
For more details, please refer to the link: https://www.nexdata.ai/datasets/81?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train models for automatic speech recognition (ASR) and speaker identification.
### Languages
British English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commerical License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions |
sakleeee/1211221 | ---
license: creativeml-openrail-m
---
|
Aviral2412/Mini_project1_pretraining | ---
license: cc-by-nc-nd-3.0
---
|
zhouquan/first_datasets | ---
license: mit
---
|
Sangmun/wiki_doc_preprocessed | ---
license: other
---
|
fathyshalab/clinic-utility | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
- name: label_text
dtype: string
splits:
- name: train
num_bytes: 33764.5
num_examples: 525
- name: test
num_bytes: 14470.5
num_examples: 225
download_size: 0
dataset_size: 48235.0
---
# Dataset Card for "clinic-utility"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
```
@inproceedings{larson-etal-2019-evaluation,
title = "An Evaluation Dataset for Intent Classification and Out-of-Scope Prediction",
author = "Larson, Stefan and
Mahendran, Anish and
Peper, Joseph J. and
Clarke, Christopher and
Lee, Andrew and
Hill, Parker and
Kummerfeld, Jonathan K. and
Leach, Kevin and
Laurenzano, Michael A. and
Tang, Lingjia and
Mars, Jason",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)",
year = "2019",
url = "https://www.aclweb.org/anthology/D19-1131"
}
``` |
gwlms/dewiki-20230701 | ---
license: cc-by-sa-3.0
language:
- de
--- |
fathyshalab/reklamation24_medizin-gesundheit-pflege | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: label_name
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 218144
num_examples: 466
- name: test
num_bytes: 51557
num_examples: 117
download_size: 0
dataset_size: 269701
---
# Dataset Card for "reklamation24_medizin-gesundheit-pflege"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
iarbel/legal_eval | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: dataset
dtype: string
splits:
- name: train
num_bytes: 12307572
num_examples: 7589
- name: test
num_bytes: 12378874
num_examples: 6980
download_size: 12169603
dataset_size: 24686446
---
# Dataset Card for "legal_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ESPEKTRO/moisesgrave | ---
license: openrail
---
|
sh0416/mr | ---
task_categories:
- text-classification
language:
- en
---
# Movie Review Data
* Original source: sentence polarity dataset v1.0 http://www.cs.cornell.edu/people/pabo/movie-review-data/
* Appears to be the same data as https://huggingface.co/datasets/rotten_tomatoes, but with a different split.
## Original README
=======
Introduction
This README v1.0 (June, 2005) for the v1.0 sentence polarity dataset comes
from the URL
http://www.cs.cornell.edu/people/pabo/movie-review-data .
=======
Citation Info
This data was first used in Bo Pang and Lillian Lee,
``Seeing stars: Exploiting class relationships for sentiment categorization
with respect to rating scales.'', Proceedings of the ACL, 2005.
@InProceedings{Pang+Lee:05a,
author = {Bo Pang and Lillian Lee},
title = {Seeing stars: Exploiting class relationships for sentiment
categorization with respect to rating scales},
booktitle = {Proceedings of the ACL},
year = 2005
}
=======
Data Format Summary
- rt-polaritydata.tar.gz: contains this readme and two data files that
were used in the experiments described in Pang/Lee ACL 2005.
Specifically:
* rt-polarity.pos contains 5331 positive snippets
* rt-polarity.neg contains 5331 negative snippets
Each line in these two files corresponds to a single snippet (usually
containing roughly one single sentence); all snippets are down-cased.
The snippets were labeled automatically, as described below (see
section "Label Decision").
Note: The original source files from which the data in
rt-polaritydata.tar.gz was derived can be found in the subjective
part (Rotten Tomatoes pages) of subjectivity_html.tar.gz (released
with subjectivity dataset v1.0).
=======
Label Decision
We assumed snippets (from Rotten Tomatoes webpages) for reviews marked with
``fresh'' are positive, and those for reviews marked with ``rotten'' are
negative.
## Preprocessing
To make a CSV with `text` and `label` fields, we use the following script.
```python3
import csv
import random

# NOTE: The encoding of the original files is "latin_1". We convert it to "utf8".
with open("rt-polarity.pos", encoding="latin_1") as f:
    texts_pos = [line.strip() for line in f]
with open("rt-polarity.neg", encoding="latin_1") as f:
    texts_neg = [line.strip() for line in f]

rows_pos = [{"text": text, "label": 1} for text in texts_pos]
rows_neg = [{"text": text, "label": 0} for text in texts_neg]

# NOTE: For fair validation, we split the data into train and test. We also
# provide the whole (unsplit) set for researchers who want a different setting.
# NOTE: We follow the split setting in the LM-BFF paper.
rows_whole = rows_pos + rows_neg
random.Random(42).shuffle(rows_whole)
rows_test, rows_train = rows_whole[:2000], rows_whole[2000:]

with open("whole.csv", "w", encoding="utf8") as f:
    writer = csv.DictWriter(f, fieldnames=["text", "label"])
    writer.writerows(rows_whole)
with open("train.csv", "w", encoding="utf8") as f:
    writer = csv.DictWriter(f, fieldnames=["text", "label"])
    writer.writerows(rows_train)
with open("test.csv", "w", encoding="utf8") as f:
    writer = csv.DictWriter(f, fieldnames=["text", "label"])
    writer.writerows(rows_test)
```
|
bible-nlp/biblenlp-corpus | ---
annotations_creators:
- no-annotation
language_creators:
- expert-generated
language:
- aai
- aak
- aau
- aaz
- abt
- abx
- aby
- acf
- acr
- acu
- adz
- aer
- aey
- agd
- agg
- agm
- agn
- agr
- agt
- agu
- aia
- aii
- aka
- ake
- alp
- alq
- als
- aly
- ame
- amf
- amk
- amm
- amn
- amo
- amp
- amr
- amu
- amx
- anh
- anv
- aoi
- aoj
- aom
- aon
- apb
- ape
- apn
- apr
- apu
- apw
- apz
- arb
- are
- arl
- arn
- arp
- asm
- aso
- ata
- atb
- atd
- atg
- att
- auc
- aui
- auy
- avt
- awb
- awk
- awx
- azb
- azg
- azz
- bao
- bba
- bbb
- bbr
- bch
- bco
- bdd
- bea
- bef
- bel
- ben
- beo
- beu
- bgs
- bgt
- bhg
- bhl
- big
- bjk
- bjp
- bjr
- bjv
- bjz
- bkd
- bki
- bkq
- bkx
- bla
- blw
- blz
- bmh
- bmk
- bmr
- bmu
- bnp
- boa
- boj
- bon
- box
- bpr
- bps
- bqc
- bqp
- bre
- bsj
- bsn
- bsp
- bss
- buk
- bus
- bvd
- bvr
- bxh
- byr
- byx
- bzd
- bzh
- bzj
- caa
- cab
- cac
- caf
- cak
- cao
- cap
- car
- cav
- cax
- cbc
- cbi
- cbk
- cbr
- cbs
- cbt
- cbu
- cbv
- cco
- ceb
- cek
- ces
- cgc
- cha
- chd
- chf
- chk
- chq
- chz
- cjo
- cjv
- ckb
- cle
- clu
- cme
- cmn
- cni
- cnl
- cnt
- cof
- con
- cop
- cot
- cpa
- cpb
- cpc
- cpu
- cpy
- crn
- crx
- cso
- csy
- cta
- cth
- ctp
- ctu
- cub
- cuc
- cui
- cuk
- cut
- cux
- cwe
- cya
- daa
- dad
- dah
- dan
- ded
- deu
- dgc
- dgr
- dgz
- dhg
- dif
- dik
- dji
- djk
- djr
- dob
- dop
- dov
- dwr
- dww
- dwy
- ebk
- eko
- emi
- emp
- eng
- enq
- epo
- eri
- ese
- esk
- etr
- ewe
- faa
- fai
- far
- ffm
- for
- fra
- fue
- fuf
- fuh
- gah
- gai
- gam
- gaw
- gdn
- gdr
- geb
- gfk
- ghs
- glk
- gmv
- gng
- gnn
- gnw
- gof
- grc
- gub
- guh
- gui
- guj
- gul
- gum
- gun
- guo
- gup
- gux
- gvc
- gvf
- gvn
- gvs
- gwi
- gym
- gyr
- hat
- hau
- haw
- hbo
- hch
- heb
- heg
- hin
- hix
- hla
- hlt
- hmo
- hns
- hop
- hot
- hrv
- hto
- hub
- hui
- hun
- hus
- huu
- huv
- hvn
- ian
- ign
- ikk
- ikw
- ilo
- imo
- inb
- ind
- ino
- iou
- ipi
- isn
- ita
- iws
- ixl
- jac
- jae
- jao
- jic
- jid
- jiv
- jni
- jpn
- jvn
- kan
- kaq
- kbc
- kbh
- kbm
- kbq
- kdc
- kde
- kdl
- kek
- ken
- kew
- kgf
- kgk
- kgp
- khs
- khz
- kik
- kiw
- kiz
- kje
- kjn
- kjs
- kkc
- kkl
- klt
- klv
- kmg
- kmh
- kmk
- kmo
- kms
- kmu
- kne
- knf
- knj
- knv
- kos
- kpf
- kpg
- kpj
- kpr
- kpw
- kpx
- kqa
- kqc
- kqf
- kql
- kqw
- ksd
- ksj
- ksr
- ktm
- kto
- kud
- kue
- kup
- kvg
- kvn
- kwd
- kwf
- kwi
- kwj
- kyc
- kyf
- kyg
- kyq
- kyz
- kze
- lac
- lat
- lbb
- lbk
- lcm
- leu
- lex
- lgl
- lid
- lif
- lin
- lit
- llg
- lug
- luo
- lww
- maa
- maj
- mal
- mam
- maq
- mar
- mau
- mav
- maz
- mbb
- mbc
- mbh
- mbj
- mbl
- mbs
- mbt
- mca
- mcb
- mcd
- mcf
- mco
- mcp
- mcq
- mcr
- mdy
- med
- mee
- mek
- meq
- met
- meu
- mgc
- mgh
- mgw
- mhl
- mib
- mic
- mie
- mig
- mih
- mil
- mio
- mir
- mit
- miz
- mjc
- mkj
- mkl
- mkn
- mks
- mle
- mlh
- mlp
- mmo
- mmx
- mna
- mop
- mox
- mph
- mpj
- mpm
- mpp
- mps
- mpt
- mpx
- mqb
- mqj
- msb
- msc
- msk
- msm
- msy
- mti
- mto
- mux
- muy
- mva
- mvn
- mwc
- mwe
- mwf
- mwp
- mxb
- mxp
- mxq
- mxt
- mya
- myk
- myu
- myw
- myy
- mzz
- nab
- naf
- nak
- nas
- nay
- nbq
- nca
- nch
- ncj
- ncl
- ncu
- ndg
- ndj
- nfa
- ngp
- ngu
- nhe
- nhg
- nhi
- nho
- nhr
- nhu
- nhw
- nhy
- nif
- nii
- nin
- nko
- nld
- nlg
- nmw
- nna
- nnq
- noa
- nop
- not
- nou
- npi
- npl
- nsn
- nss
- ntj
- ntp
- ntu
- nuy
- nvm
- nwi
- nya
- nys
- nyu
- obo
- okv
- omw
- ong
- ons
- ood
- opm
- ory
- ote
- otm
- otn
- otq
- ots
- pab
- pad
- pah
- pan
- pao
- pes
- pib
- pio
- pir
- piu
- pjt
- pls
- plu
- pma
- poe
- poh
- poi
- pol
- pon
- por
- poy
- ppo
- prf
- pri
- ptp
- ptu
- pwg
- qub
- quc
- quf
- quh
- qul
- qup
- qvc
- qve
- qvh
- qvm
- qvn
- qvs
- qvw
- qvz
- qwh
- qxh
- qxn
- qxo
- rai
- reg
- rgu
- rkb
- rmc
- rmy
- ron
- roo
- rop
- row
- rro
- ruf
- rug
- rus
- rwo
- sab
- san
- sbe
- sbk
- sbs
- seh
- sey
- sgb
- sgz
- shj
- shp
- sim
- sja
- sll
- smk
- snc
- snn
- snp
- snx
- sny
- som
- soq
- soy
- spa
- spl
- spm
- spp
- sps
- spy
- sri
- srm
- srn
- srp
- srq
- ssd
- ssg
- ssx
- stp
- sua
- sue
- sus
- suz
- swe
- swh
- swp
- sxb
- tac
- taj
- tam
- tav
- taw
- tbc
- tbf
- tbg
- tbl
- tbo
- tbz
- tca
- tcs
- tcz
- tdt
- tee
- tel
- ter
- tet
- tew
- tfr
- tgk
- tgl
- tgo
- tgp
- tha
- thd
- tif
- tim
- tiw
- tiy
- tke
- tku
- tlf
- tmd
- tna
- tnc
- tnk
- tnn
- tnp
- toc
- tod
- tof
- toj
- ton
- too
- top
- tos
- tpa
- tpi
- tpt
- tpz
- trc
- tsw
- ttc
- tte
- tuc
- tue
- tuf
- tuo
- tur
- tvk
- twi
- txq
- txu
- tzj
- tzo
- ubr
- ubu
- udu
- uig
- ukr
- uli
- ulk
- upv
- ura
- urb
- urd
- uri
- urt
- urw
- usa
- usp
- uvh
- uvl
- vid
- vie
- viv
- vmy
- waj
- wal
- wap
- wat
- wbi
- wbp
- wed
- wer
- wim
- wiu
- wiv
- wmt
- wmw
- wnc
- wnu
- wol
- wos
- wrk
- wro
- wrs
- wsk
- wuv
- xav
- xbi
- xed
- xla
- xnn
- xon
- xsi
- xtd
- xtm
- yaa
- yad
- yal
- yap
- yaq
- yby
- ycn
- yka
- yle
- yml
- yon
- yor
- yrb
- yre
- yss
- yuj
- yut
- yuw
- yva
- zaa
- zab
- zac
- zad
- zai
- zaj
- zam
- zao
- zap
- zar
- zas
- zat
- zav
- zaw
- zca
- zga
- zia
- ziw
- zlm
- zos
- zpc
- zpl
- zpm
- zpo
- zpq
- zpu
- zpv
- zpz
- zsr
- ztq
- zty
- zyp
- be
- br
- cs
- ch
- zh
- de
- en
- eo
- fr
- ht
- he
- hr
- id
- it
- ja
- la
- nl
- ru
- sa
- so
- es
- sr
- sv
- to
- uk
- vi
license:
- cc-by-4.0
- other
multilinguality:
- translation
- multilingual
pretty_name: biblenlp-corpus
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- translation
task_ids: []
---
# Dataset Card for BibleNLP Corpus
### Dataset Summary
Partial and complete Bible translations in 833 languages, aligned by verse.
### Languages
aai, aak, aau, aaz, abt, abx, aby, acf, acr, acu, adz, aer, aey, agd, agg, agm, agn, agr, agt, agu, aia, aii, aka, ake, alp, alq, als, aly, ame, amf, amk, amm, amn, amo, amp, amr, amu, amx, anh, anv, aoi, aoj, aom, aon, apb, ape, apn, apr, apu, apw, apz, arb, are, arl, arn, arp, asm, aso, ata, atb, atd, atg, att, auc, aui, auy, avt, awb, awk, awx, azb, azg, azz, bao, bba, bbb, bbr, bch, bco, bdd, bea, bef, bel, ben, beo, beu, bgs, bgt, bhg, bhl, big, bjk, bjp, bjr, bjv, bjz, bkd, bki, bkq, bkx, bla, blw, blz, bmh, bmk, bmr, bmu, bnp, boa, boj, bon, box, bpr, bps, bqc, bqp, bre, bsj, bsn, bsp, bss, buk, bus, bvd, bvr, bxh, byr, byx, bzd, bzh, bzj, caa, cab, cac, caf, cak, cao, cap, car, cav, cax, cbc, cbi, cbk, cbr, cbs, cbt, cbu, cbv, cco, ceb, cek, ces, cgc, cha, chd, chf, chk, chq, chz, cjo, cjv, ckb, cle, clu, cme, cmn, cni, cnl, cnt, cof, con, cop, cot, cpa, cpb, cpc, cpu, cpy, crn, crx, cso, csy, cta, cth, ctp, ctu, cub, cuc, cui, cuk, cut, cux, cwe, cya, daa, dad, dah, dan, ded, deu, dgc, dgr, dgz, dhg, dif, dik, dji, djk, djr, dob, dop, dov, dwr, dww, dwy, ebk, eko, emi, emp, eng, enq, epo, eri, ese, esk, etr, ewe, faa, fai, far, ffm, for, fra, fue, fuf, fuh, gah, gai, gam, gaw, gdn, gdr, geb, gfk, ghs, glk, gmv, gng, gnn, gnw, gof, grc, gub, guh, gui, guj, gul, gum, gun, guo, gup, gux, gvc, gvf, gvn, gvs, gwi, gym, gyr, hat, hau, haw, hbo, hch, heb, heg, hin, hix, hla, hlt, hmo, hns, hop, hot, hrv, hto, hub, hui, hun, hus, huu, huv, hvn, ian, ign, ikk, ikw, ilo, imo, inb, ind, ino, iou, ipi, isn, ita, iws, ixl, jac, jae, jao, jic, jid, jiv, jni, jpn, jvn, kan, kaq, kbc, kbh, kbm, kbq, kdc, kde, kdl, kek, ken, kew, kgf, kgk, kgp, khs, khz, kik, kiw, kiz, kje, kjn, kjs, kkc, kkl, klt, klv, kmg, kmh, kmk, kmo, kms, kmu, kne, knf, knj, knv, kos, kpf, kpg, kpj, kpr, kpw, kpx, kqa, kqc, kqf, kql, kqw, ksd, ksj, ksr, ktm, kto, kud, kue, kup, kvg, kvn, kwd, kwf, kwi, kwj, kyc, kyf, kyg, kyq, kyz, kze, lac, lat, lbb, lbk, lcm, leu, lex, lgl, lid, lif, lin, lit, llg, 
lug, luo, lww, maa, maj, mal, mam, maq, mar, mau, mav, maz, mbb, mbc, mbh, mbj, mbl, mbs, mbt, mca, mcb, mcd, mcf, mco, mcp, mcq, mcr, mdy, med, mee, mek, meq, met, meu, mgc, mgh, mgw, mhl, mib, mic, mie, mig, mih, mil, mio, mir, mit, miz, mjc, mkj, mkl, mkn, mks, mle, mlh, mlp, mmo, mmx, mna, mop, mox, mph, mpj, mpm, mpp, mps, mpt, mpx, mqb, mqj, msb, msc, msk, msm, msy, mti, mto, mux, muy, mva, mvn, mwc, mwe, mwf, mwp, mxb, mxp, mxq, mxt, mya, myk, myu, myw, myy, mzz, nab, naf, nak, nas, nay, nbq, nca, nch, ncj, ncl, ncu, ndg, ndj, nfa, ngp, ngu, nhe, nhg, nhi, nho, nhr, nhu, nhw, nhy, nif, nii, nin, nko, nld, nlg, nmw, nna, nnq, noa, nop, not, nou, npi, npl, nsn, nss, ntj, ntp, ntu, nuy, nvm, nwi, nya, nys, nyu, obo, okv, omw, ong, ons, ood, opm, ory, ote, otm, otn, otq, ots, pab, pad, pah, pan, pao, pes, pib, pio, pir, piu, pjt, pls, plu, pma, poe, poh, poi, pol, pon, por, poy, ppo, prf, pri, ptp, ptu, pwg, qub, quc, quf, quh, qul, qup, qvc, qve, qvh, qvm, qvn, qvs, qvw, qvz, qwh, qxh, qxn, qxo, rai, reg, rgu, rkb, rmc, rmy, ron, roo, rop, row, rro, ruf, rug, rus, rwo, sab, san, sbe, sbk, sbs, seh, sey, sgb, sgz, shj, shp, sim, sja, sll, smk, snc, snn, snp, snx, sny, som, soq, soy, spa, spl, spm, spp, sps, spy, sri, srm, srn, srp, srq, ssd, ssg, ssx, stp, sua, sue, sus, suz, swe, swh, swp, sxb, tac, taj, tam, tav, taw, tbc, tbf, tbg, tbl, tbo, tbz, tca, tcs, tcz, tdt, tee, tel, ter, tet, tew, tfr, tgk, tgl, tgo, tgp, tha, thd, tif, tim, tiw, tiy, tke, tku, tlf, tmd, tna, tnc, tnk, tnn, tnp, toc, tod, tof, toj, ton, too, top, tos, tpa, tpi, tpt, tpz, trc, tsw, ttc, tte, tuc, tue, tuf, tuo, tur, tvk, twi, txq, txu, tzj, tzo, ubr, ubu, udu, uig, ukr, uli, ulk, upv, ura, urb, urd, uri, urt, urw, usa, usp, uvh, uvl, vid, vie, viv, vmy, waj, wal, wap, wat, wbi, wbp, wed, wer, wim, wiu, wiv, wmt, wmw, wnc, wnu, wol, wos, wrk, wro, wrs, wsk, wuv, xav, xbi, xed, xla, xnn, xon, xsi, xtd, xtm, yaa, yad, yal, yap, yaq, yby, ycn, yka, yle, yml, yon, yor, yrb, yre, yss, yuj, 
yut, yuw, yva, zaa, zab, zac, zad, zai, zaj, zam, zao, zap, zar, zas, zat, zav, zaw, zca, zga, zia, ziw, zlm, zos, zpc, zpl, zpm, zpo, zpq, zpu, zpv, zpz, zsr, ztq, zty, zyp
## Dataset Structure
### Data Fields
**translation**
- **languages** - an N length list of the languages of the translations, sorted alphabetically
- **translation** - an N length list with the translations each corresponding to the language specified in the above field
**files**
- **lang** - an N length list of the languages of the files, in order of input
- **file** - an N length list of the filenames from the corpus on github, each corresponding with the lang above
**ref** - the verse(s) contained in the record, as a list, with each represented as: ``<a three letter book code> <chapter number>:<verse number>``
**licenses** - an N length list of licenses, corresponding to the list of files above
**copyrights** - information on copyright holders, corresponding to the list of files above
### Usage
The dataset loading script requires tqdm, ijson, and numpy to be installed.
Specify the languages to be paired as a list of ISO 639-3 language codes, such as ``languages = ['eng', 'fra']``.
By default, the script returns both individual verse pairs and verses covering a full range. If only the individual verses are desired, use ``pair='single'``. If only the maximum-range pairing is desired, use ``pair='range'`` (for example, if one text uses the verse range covering GEN 1:1-3, all texts would return only the full-length pairing).
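As a rough illustration of the range-pairing behavior, here is a stdlib-only sketch with made-up verse-aligned records (this is not the loading script itself; the record layout is simplified):

```python
# Hypothetical verse-aligned records: one text covers GEN 1:1-3 as a single
# range, while the other has the three verses individually.
text_a = {("GEN 1:1", "GEN 1:2", "GEN 1:3"): "In the beginning ..."}
text_b = {
    ("GEN 1:1",): "Au commencement ...",
    ("GEN 1:2",): "La terre ...",
    ("GEN 1:3",): "Dieu dit ...",
}

def range_pairs(a, b):
    """Pair translations over the maximal shared verse range (pair='range')."""
    out = []
    for refs_a, t_a in a.items():
        # Join b's entries whose refs all fall inside text a's verse range.
        parts = [t for refs_b, t in sorted(b.items()) if set(refs_b) <= set(refs_a)]
        if parts:
            out.append((refs_a, t_a, " ".join(parts)))
    return out

for refs, en, fr in range_pairs(text_a, text_b):
    print(refs, "|", en, "|", fr)
```

With ``pair='range'``, both sides would thus yield a single pair covering GEN 1:1-3, whereas ``pair='single'`` keeps only verse-level pairs where both texts have them.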
## Sources
https://github.com/BibleNLP/ebible-corpus |
davanstrien/ia-loaded2 | ---
dataset_info:
features:
- name: crawl_date
dtype: int64
- name: last_modified_date
dtype: float64
- name: url
dtype: string
- name: filename
dtype: string
- name: extension
dtype: string
- name: mime_type_web_server
dtype: string
- name: mime_type_tika
dtype: string
- name: width
dtype: int64
- name: height
dtype: int64
- name: md5
dtype: string
- name: sha1
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 214200379.736
num_examples: 658
download_size: 0
dataset_size: 214200379.736
---
# Dataset Card for "ia-loaded2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tomekkorbak/pile-detoxify | ---
annotations_creators:
- machine-generated
language:
- en
language_creators:
- found
license:
- mit
multilinguality:
- monolingual
pretty_name: pile-detoxify
size_categories:
- 1M<n<10M
source_datasets:
- extended|the_pile
tags:
- toxicity
- pretraining-with-human-feedback
task_categories:
- text-classification
- other
task_ids:
- acceptability-classification
- hate-speech-detection
- text-scoring
---
# Dataset Card for pile-pii-scrubadub
## Dataset Description
- **Repository: https://github.com/tomekkorbak/aligned-pretraining-objectives**
- **Paper: Arxiv link to be added**
### Dataset Summary
This dataset contains text from [The Pile](https://huggingface.co/datasets/the_pile), annotated based on the toxicity of each sentence.
Each document (row in the dataset) is segmented into sentences, and each sentence is given a score: the toxicity predicted by [Detoxify](https://github.com/unitaryai/detoxify).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
This dataset is taken from [The Pile](https://huggingface.co/datasets/the_pile), which is English text.
## Dataset Structure
### Data Instances
1949977
### Data Fields
- texts (sequence): a list of the sentences in the document, segmented using SpaCy
- meta (dict): the section of [The Pile](https://huggingface.co/datasets/the_pile) from which it originated
- scores (sequence): a score for each sentence in the `texts` column indicating the toxicity predicted by [Detoxify](https://github.com/unitaryai/detoxify)
- avg_score (float64): the average of the scores listed in the `scores` column
- num_sents (int64): the number of sentences (and scores) in that document
### Data Splits
Training set only
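As an illustration of how the fields above relate, here is a stdlib-only sketch that recomputes `avg_score` and `num_sents` from per-sentence scores and filters documents by average toxicity. The records and the 0.1 threshold are made up; field names match the card:

```python
import statistics

# Made-up records mirroring the card's schema (texts / scores per sentence).
docs = [
    {"texts": ["Nice weather today.", "Have a good one."], "scores": [0.01, 0.02]},
    {"texts": ["You are terrible.", "Go away."], "scores": [0.95, 0.60]},
]

for doc in docs:
    doc["num_sents"] = len(doc["texts"])          # number of sentences/scores
    doc["avg_score"] = statistics.mean(doc["scores"])  # matches avg_score

# Keep only documents whose average predicted toxicity is below a threshold.
clean = [d for d in docs if d["avg_score"] < 0.1]
print(f"kept {len(clean)} of {len(docs)} documents")
```

A training pipeline could use such a filter (or the per-sentence `scores`) to condition generation away from toxic text.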
## Dataset Creation
### Curation Rationale
This is labeled text from [The Pile](https://huggingface.co/datasets/the_pile), a large dataset of text in English. The text is scored for toxicity so that generative language models can be trained to avoid generating toxic text.
### Source Data
#### Initial Data Collection and Normalization
This is labeled text from [The Pile](https://huggingface.co/datasets/the_pile).
#### Who are the source language producers?
Please see [The Pile](https://huggingface.co/datasets/the_pile) for the source of the dataset.
### Annotations
#### Annotation process
Each sentence was scored using [Detoxify](https://github.com/unitaryai/detoxify), which is a toxic comment classifier.
We used the `unbiased` model which is based on the 124M parameter [RoBERTa](https://arxiv.org/abs/1907.11692) and trained on the [Jigsaw Unintended Bias in Toxicity Classification dataset](https://www.kaggle.com/c/jigsaw-unintended-bias-in-toxicity-classification).
#### Who are the annotators?
[Detoxify](https://github.com/unitaryai/detoxify)
### Personal and Sensitive Information
This dataset contains all of the personal identifiable information and toxic text that was originally contained in [The Pile](https://huggingface.co/datasets/the_pile).
## Considerations for Using the Data
### Social Impact of Dataset
This dataset contains examples of toxic text and personal identifiable information.
(A version of this dataset with personal identifiable information annotated is [available here](https://huggingface.co/datasets/tomekkorbak/pile-pii-scrubadub).)
Please take care to avoid misusing the toxic text or putting anybody in danger by publicizing their information.
This dataset is intended for research purposes only. We cannot guarantee that all toxic text has been detected, and we cannot guarantee that models trained using it will avoid generating toxic text.
We do not recommend deploying models trained on this data.
### Discussion of Biases
This dataset contains all biases from The Pile discussed in their paper: https://arxiv.org/abs/2101.00027
### Other Known Limitations
The toxic text in this dataset was detected using imperfect automated detection methods. We cannot guarantee that the labels are 100% accurate.
## Additional Information
### Dataset Curators
[The Pile](https://huggingface.co/datasets/the_pile)
### Licensing Information
From [The Pile](https://huggingface.co/datasets/the_pile): PubMed Central: [MIT License](https://github.com/EleutherAI/pile-pubmedcentral/blob/master/LICENSE)
### Citation Information
Paper information to be added
### Contributions
[The Pile](https://huggingface.co/datasets/the_pile) |
jonas/undp_jobs_raw | ---
license: wtfpl
---
|
Luckyroom/cyber-dataset | ---
license: llama2
---
|
BangumiBase/uruseiyatsura2022 | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Urusei Yatsura (2022)
This is the image base of bangumi Urusei Yatsura (2022), we detected 59 characters, 6234 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned, they may be noisy actual.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 244 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 69 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 25 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 468 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 1327 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 105 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 150 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 43 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 32 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 166 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 54 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 43 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 28 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 34 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 35 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 280 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 198 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 18 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 117 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 21 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 54 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 31 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 15 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 150 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 970 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 14 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 10 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 15 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 41 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 22 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 19 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 30 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 38 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 19 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 17 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 77 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 307 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 12 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 23 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 15 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 12 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 6 | [Download](41/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 42 | 8 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 21 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 28 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 92 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 48 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 94 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 10 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 16 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 63 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 12 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 20 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 13 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 41 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 9 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 223 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 8 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 174 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
RENREN6/lima-preference-dataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: better_response
dtype: string
- name: worse_response
dtype: string
splits:
- name: train
num_bytes: 1857133
num_examples: 200
download_size: 345058
dataset_size: 1857133
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
wefussell/amasum-app-df | ---
license: mit
---
|
CyberHarem/johnston_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of johnston (Kantai Collection)
This is the dataset of johnston (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `long_hair, two_side_up, light_brown_hair, brown_eyes, breasts, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 624.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/johnston_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 375.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/johnston_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1254 | 840.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/johnston_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 561.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/johnston_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1254 | 1.13 GiB | [Download](https://huggingface.co/datasets/CyberHarem/johnston_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/johnston_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, blue_one-piece_swimsuit, solo, cleavage, looking_at_viewer, hair_ribbon, jacket, casual_one-piece_swimsuit, choker, see-through, simple_background, white_background, cowboy_shot, ice_cream, official_alternate_costume, large_breasts |
| 1 | 9 |  |  |  |  |  | 1girl, blue_bikini, cleavage, navel, solo, looking_at_viewer, simple_background, white_background, choker, cowboy_shot, hair_ribbon, collarbone, black_gloves, blush, single_glove, twitter_username |
| 2 | 11 |  |  |  |  |  | 1girl, blue_bikini, cleavage, cowboy_shot, day, solo, blue_sky, cloud, looking_at_viewer, navel, choker, collarbone, outdoors, hair_ribbon, ocean, open_mouth, beach, black_gloves, blush, groin, ice_cream, single_glove |
| 3 | 36 |  |  |  |  |  | 1girl, black_skirt, blue_shirt, cleavage, off_shoulder, pleated_skirt, sailor_collar, serafuku, solo, looking_at_viewer, black_gloves, black_thighhighs, garter_straps, single_glove, simple_background, cowboy_shot, white_background |
| 4 | 7 |  |  |  |  |  | 1girl, adapted_turret, black_gloves, black_skirt, black_thighhighs, blue_shirt, cannon, cleavage, garter_straps, machinery, off_shoulder, pleated_skirt, rigging, sailor_collar, serafuku, shin_guards, smokestack, solo, simple_background, single_glove, full_body, looking_at_viewer, open_mouth, grey_background, standing, white_background |
| 5 | 5 |  |  |  |  |  | 1girl, blue_shirt, looking_at_viewer, off_shoulder, sailor_collar, serafuku, solo, upper_body, cleavage, simple_background, white_background, one-hour_drawing_challenge, smile, twitter_username, dated |
| 6 | 14 |  |  |  |  |  | 1girl, black_dress, halloween_costume, official_alternate_costume, solo, cleavage, garter_straps, black_thighhighs, large_breasts, black_gloves, open_mouth, cowboy_shot, single_glove, fang |
| 7 | 10 |  |  |  |  |  | 1girl, solo, white_shirt, official_alternate_costume, short_sleeves, simple_background, black_pantyhose, blue_skirt, looking_at_viewer, white_background, black_footwear, flower, smile, ascot, boots, full_body, ribbon |
| 8 | 13 |  |  |  |  |  | 1girl, playboy_bunny, solo, cleavage, fake_animal_ears, rabbit_ears, detached_collar, blue_leotard, looking_at_viewer, black_thighhighs, cowboy_shot, adapted_costume, strapless_leotard, rabbit_tail, alternate_costume, black_gloves, garter_straps, hand_on_hip, wrist_cuffs |
| 9 | 7 |  |  |  |  |  | 1girl, hetero, penis, solo_focus, vaginal, 1boy, bar_censor, nipples, pussy, navel, blush, large_breasts, open_mouth, thighhighs, clothed_sex, clothing_aside, cum, official_alternate_costume, on_back, sweat |
| 10 | 6 |  |  |  |  |  | 1girl, black_gloves, choker, solo, weapon, navel, simple_background, skirt, alternate_costume, midriff, tank_top, white_background, oni_horns, shirt, single_glove, thigh_strap, white_socks |
| 11 | 8 |  |  |  |  |  | 1girl, solo, white_apron, alternate_costume, frilled_apron, wa_maid, cowboy_shot, wide_sleeves, blue_kimono, hakama, holding, long_sleeves, maid_headdress, one-hour_drawing_challenge, thighhighs, black_skirt, blush, dated, floral_print, garter_straps, hair_between_eyes, looking_at_viewer, pink_kimono, simple_background, smile, tray, white_background |
| 12 | 12 |  |  |  |  |  | 1girl, solo, beret, blue_headwear, yellow_scarf, blue_coat, brown_skirt, blush, gift_box, heart-shaped_box, pleated_skirt, ribbed_sweater, white_sweater, fur-trimmed_coat, fur-trimmed_jacket, holding_gift, looking_at_viewer, star_(symbol), white_background, black_pantyhose, blue_jacket, long_sleeves, official_alternate_costume, simple_background, valentine |
| 13 | 9 |  |  |  |  |  | print_kimono, 1girl, floral_print, obi, pink_kimono, wide_sleeves, alternate_costume, hair_ornament, looking_at_viewer, solo, fur-trimmed_kimono, long_sleeves, blush, smile, choker, cowboy_shot, single_glove |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_one-piece_swimsuit | solo | cleavage | looking_at_viewer | hair_ribbon | jacket | casual_one-piece_swimsuit | choker | see-through | simple_background | white_background | cowboy_shot | ice_cream | official_alternate_costume | large_breasts | blue_bikini | navel | collarbone | black_gloves | blush | single_glove | twitter_username | day | blue_sky | cloud | outdoors | ocean | open_mouth | beach | groin | black_skirt | blue_shirt | off_shoulder | pleated_skirt | sailor_collar | serafuku | black_thighhighs | garter_straps | adapted_turret | cannon | machinery | rigging | shin_guards | smokestack | full_body | grey_background | standing | upper_body | one-hour_drawing_challenge | smile | dated | black_dress | halloween_costume | fang | white_shirt | short_sleeves | black_pantyhose | blue_skirt | black_footwear | flower | ascot | boots | ribbon | playboy_bunny | fake_animal_ears | rabbit_ears | detached_collar | blue_leotard | adapted_costume | strapless_leotard | rabbit_tail | alternate_costume | hand_on_hip | wrist_cuffs | hetero | penis | solo_focus | vaginal | 1boy | bar_censor | nipples | pussy | thighhighs | clothed_sex | clothing_aside | cum | on_back | sweat | weapon | skirt | midriff | tank_top | oni_horns | shirt | thigh_strap | white_socks | white_apron | frilled_apron | wa_maid | wide_sleeves | blue_kimono | hakama | holding | long_sleeves | maid_headdress | floral_print | hair_between_eyes | pink_kimono | tray | beret | blue_headwear | yellow_scarf | blue_coat | brown_skirt | gift_box | heart-shaped_box | ribbed_sweater | white_sweater | fur-trimmed_coat | fur-trimmed_jacket | holding_gift | star_(symbol) | blue_jacket | valentine | print_kimono | obi | hair_ornament | fur-trimmed_kimono |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------------------|:-------|:-----------|:--------------------|:--------------|:---------|:----------------------------|:---------|:--------------|:--------------------|:-------------------|:--------------|:------------|:-----------------------------|:----------------|:--------------|:--------|:-------------|:---------------|:--------|:---------------|:-------------------|:------|:-----------|:--------|:-----------|:--------|:-------------|:--------|:--------|:--------------|:-------------|:---------------|:----------------|:----------------|:-----------|:-------------------|:----------------|:-----------------|:---------|:------------|:----------|:--------------|:-------------|:------------|:------------------|:-----------|:-------------|:-----------------------------|:--------|:--------|:--------------|:--------------------|:-------|:--------------|:----------------|:------------------|:-------------|:-----------------|:---------|:--------|:--------|:---------|:----------------|:-------------------|:--------------|:------------------|:---------------|:------------------|:--------------------|:--------------|:--------------------|:--------------|:--------------|:---------|:--------|:-------------|:----------|:-------|:-------------|:----------|:--------|:-------------|:--------------|:-----------------|:------|:----------|:--------|:---------|:--------|:----------|:-----------|:------------|:--------|:--------------|:--------------|:--------------|:----------------|:----------|:---------------|:--------------|:---------|:----------|:---------------|:-----------------|:---------------|:--------------------|:--------------|:-------|:--------|:----------------|:---------------|:------------|:--------------|:-----------|:-------------------|:-----------------|:----------------|:-------------------|:---------------------|:---------------|:----------------|:--------------|:------------|:---------------|:------|:----------------|:---------------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | | X | X | X | X | | | X | | X | X | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | | X | X | X | X | | | X | | | | X | X | | | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 36 |  |  |  |  |  | X | | X | X | X | | | | | | X | X | X | | | | | | | X | | X | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | X | X | X | | | | | | X | X | | | | | | | | X | | X | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | X | X | | | | | | X | X | | | | | | | | | | | X | | | | | | | | | | X | X | | X | X | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 14 |  |  |  |  |  | X | | X | X | | | | | | | | | X | | X | X | | | | X | | X | | | | | | | X | | | | | | | | | X | X | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 10 |  |  |  |  |  | X | | X | | X | | | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 13 |  |  |  |  |  | X | | X | X | X | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | | | | | | | | | | | | | | X | X | | X | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 6 |  |  |  |  |  | X | | X | | | | | | X | | X | X | | | | | | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 8 |  |  |  |  |  | X | | X | | X | | | | | | X | X | X | | | | | | | | X | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 12 | 12 |  |  |  |  |  | X | | X | | X | | | | | | X | X | | | X | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 13 | 9 |  |  |  |  |  | X | | X | | X | | | | X | | | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | X | | X | | | | | | | | | | | | | | | | | X | X | X | X |
|
ydqe2/kaggle_financial_sentiment_resplit | ---
license: mit
task_categories:
- text-classification
language:
- en
pretty_name: d
size_categories:
- 1K<n<10K
--- |
kaleemWaheed/twitter_dataset_1713080267 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 21461
num_examples: 52
download_size: 12851
dataset_size: 21461
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
maulinnasari/dataset_ext_20_mn | ---
dataset_info:
features:
- name: document
sequence: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 160065061
num_examples: 44972
- name: validation
num_bytes: 19636553
num_examples: 5622
- name: test
num_bytes: 19797897
num_examples: 5622
download_size: 124783985
dataset_size: 199499511
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
keremberke/chest-xray-classification | ---
task_categories:
- image-classification
tags:
- roboflow
- roboflow2huggingface
- Biology
---
<div align="center">
<img width="640" alt="keremberke/chest-xray-classification" src="https://huggingface.co/datasets/keremberke/chest-xray-classification/resolve/main/thumbnail.jpg">
</div>
### Dataset Labels
```
['NORMAL', 'PNEUMONIA']
```
### Number of Images
```json
{'train': 4077, 'test': 582, 'valid': 1165}
```
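As a quick sanity check, the split counts above can be tallied against the total reported in the dataset summary further down (5824 images); a minimal sketch:

```python
# Split sizes as reported in this card.
splits = {"train": 4077, "test": 582, "valid": 1165}

total = sum(splits.values())
print(total)  # 5824, matching the total in the dataset summary

# Rough proportion of each split
for name, count in splits.items():
    print(f"{name}: {count / total:.1%}")
```

The splits follow an approximate 70/10/20 train/test/valid division.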
### How to Use
- Install [datasets](https://pypi.org/project/datasets/):
```bash
pip install datasets
```
- Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("keremberke/chest-xray-classification", name="full")
example = ds['train'][0]
```
### Roboflow Dataset Page
[https://universe.roboflow.com/mohamed-traore-2ekkp/chest-x-rays-qjmia/dataset/2](https://universe.roboflow.com/mohamed-traore-2ekkp/chest-x-rays-qjmia/dataset/2?ref=roboflow2huggingface)
### Citation
```
```
### License
CC BY 4.0
### Dataset Summary
This dataset was exported via roboflow.ai on March 31, 2022 at 3:11 PM GMT
It includes 5824 images.
Pneumonia cases are annotated in folder format.
The following pre-processing was applied to each image:
* Auto-orientation of pixel data (with EXIF-orientation stripping)
* Resize to 640x640 (Stretch)
No image augmentation techniques were applied.
|
mask-distilled-one-sec-cv12/chunk_172 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1064044688
num_examples: 208964
download_size: 1074878084
dataset_size: 1064044688
---
# Dataset Card for "chunk_172"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kheopss/humorous_tone_v2_dataset | ---
dataset_info:
features:
- name: assistant response
dtype: string
- name: response
dtype: string
- name: system
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 582966
num_examples: 114
download_size: 355139
dataset_size: 582966
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AppleHarem/downes_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of downes (Azur Lane)
This is the dataset of downes (Azur Lane), containing 15 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
A WebUI that contains the crawlers and other tools is also available: ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI))
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 15 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 41 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 43 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 15 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 15 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 15 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 41 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 41 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 32 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 43 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 43 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
Firminoleo/leilavoz | ---
license: openrail
---
|
modelloosrvcc/datasetexemplo | ---
license: openrail
---
|
fightfei/advices_llama2_2w | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7428021.0
num_examples: 19599
- name: test
num_bytes: 151979.0
num_examples: 401
download_size: 661329
dataset_size: 7580000.0
---
# Dataset Card for "advices_llama2_2w"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/ethlin_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ethlin (Fire Emblem)
This is the dataset of ethlin (Fire Emblem), containing 44 images and their tags.
The core tags of this character are `pink_hair, long_hair, pink_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 44 | 46.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ethlin_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 44 | 26.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ethlin_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 73 | 45.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ethlin_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 44 | 40.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ethlin_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 73 | 62.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ethlin_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ethlin_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, solo, bare_shoulders, smile, jewelry, looking_at_viewer, sidelocks, bangs, detached_collar, full_body, holding, long_dress, parted_lips, shiny_hair, strapless_dress, purple_footwear, standing, transparent_background, upper_body |
| 1 | 23 |  |  |  |  |  | 1girl, cape, solo, smile, staff, open_mouth, boots, holding |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | smile | jewelry | looking_at_viewer | sidelocks | bangs | detached_collar | full_body | holding | long_dress | parted_lips | shiny_hair | strapless_dress | purple_footwear | standing | transparent_background | upper_body | cape | staff | open_mouth | boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:--------|:----------|:--------------------|:------------|:--------|:------------------|:------------|:----------|:-------------|:--------------|:-------------|:------------------|:------------------|:-----------|:-------------------------|:-------------|:-------|:--------|:-------------|:--------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | |
| 1 | 23 |  |  |  |  |  | X | X | | X | | | | | | | X | | | | | | | | | X | X | X | X |
|
imperialwarrior/open-australian-legal-qa-paraphrased-easy-gemini | ---
dataset_info:
features:
- name: index
dtype: 'null'
- name: pipeline_1_result
dtype: string
- name: pipeline_1_result_embeddings
dtype: string
- name: pipeline_2_context
dtype: string
- name: pipeline_2_result
dtype: string
- name: pipeline_2_result_embeddings
dtype: string
- name: pipeline_3_context
dtype: string
- name: pipeline_3_result
dtype: string
- name: pipeline_3_result_embeddings
dtype: string
- name: pipeline_4_context
dtype: string
- name: pipeline_4_result
dtype: string
- name: pipeline_4_result_embeddings
dtype: string
- name: pipeline_5_context
dtype: string
- name: pipeline_5_result
dtype: string
- name: pipeline_5_result_embeddings
dtype: string
- name: pipeline_6_context
dtype: string
- name: pipeline_6_result
dtype: string
- name: pipeline_6_result_embeddings
dtype: string
- name: pipeline_7_context
dtype: string
- name: pipeline_7_result
dtype: string
- name: pipeline_7_result_embeddings
dtype: string
- name: referenced_question
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
- name: question_non_retrieval_embeddings
dtype: string
- name: answer_non_retrieval_embeddings
dtype: string
- name: question_retrieval_embeddings
dtype: string
- name: answer_retrieval_embeddings
dtype: string
- name: __index_level_0__
dtype: float64
- name: case_index
dtype: float64
- name: pipeline_6_case_indexes
sequence: int64
- name: pipeline_7_case_indexes
sequence: int64
splits:
- name: train
num_bytes: 41703799
num_examples: 207
download_size: 14322382
dataset_size: 41703799
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
neuclir/csl | ---
annotations_creators:
- no-annotation
language:
- zh
- en
license:
- apache-2.0
pretty_name: CSL
size_categories:
- 100K<n<1M
source_datasets:
- extended|csl
tags: []
task_categories:
- text-retrieval
task_ids:
- document-retrieval
---
# Dataset Card for CSL
## Dataset Description
CSL is the Chinese Scientific Literature Dataset.
- **Paper:** https://aclanthology.org/2022.coling-1.344
- **Repository:** https://github.com/ydli-ai/CSL
### Dataset Summary
The dataset contains the titles, abstracts, and keywords of papers written in Chinese from several academic fields.
### Languages
- Chinese
- English (translation)
## Dataset Structure
### Data Instances
| Split | Documents |
|-----------------|----------:|
| `csl` | 396k |
| `en_translation`| 396k |
### Data Fields
- `doc_id`: unique identifier for this document
- `title`: title of the paper
- `abstract`: abstract of the paper
- `keywords`: keywords associated with the paper
- `category`: the broad category of the paper
- `category_eng`: English translation of the broad category (e.g., Engineering)
- `discipline`: academic discipline of the paper
- `discipline_eng`: English translation of the academic discipline (e.g., Agricultural Engineering)
The `en_translation` split contains documents translated using the Google Translate service.
All text is in English, so the fields `category_eng` and `discipline_eng` are omitted.
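To make the schema concrete, here is a sketch of what a record with the fields above might look like and how one could filter on them. The values and the helper function are illustrative assumptions, not taken from the actual dataset:

```python
# Hypothetical record mirroring the field list above; all values are
# made up for illustration only.
doc = {
    "doc_id": "csl-000001",
    "title": "示例论文标题",
    "abstract": "示例摘要……",
    "keywords": ["关键词一", "关键词二"],
    "category": "工学",
    "category_eng": "Engineering",
    "discipline": "农业工程",
    "discipline_eng": "Agricultural Engineering",
}

def has_discipline(record, discipline_eng):
    """Return True if the record's English discipline label matches."""
    return record.get("discipline_eng") == discipline_eng

print(has_discipline(doc, "Agricultural Engineering"))  # True
```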
## Dataset Usage
Using 🤗 Datasets:
```python
from datasets import load_dataset
dataset = load_dataset('neuclir/csl')['csl']
```
## License & Citation
This dataset is based on the [Chinese Scientific Literature Dataset](https://github.com/ydli-ai/CSL) under Apache 2.0.
The primary changes are the addition of `doc_id`s, English translations of the category and discipline descriptions by a native speaker,
and basic de-duplication. Code that performed this modification is available in [this repository](https://github.com/NeuCLIR/csl-preprocess).
If you use this data, please cite:
```
@inproceedings{li-etal-2022-csl,
title = "{CSL}: A Large-scale {C}hinese Scientific Literature Dataset",
author = "Li, Yudong and
Zhang, Yuqing and
Zhao, Zhe and
Shen, Linlin and
Liu, Weijie and
Mao, Weiquan and
Zhang, Hui",
booktitle = "Proceedings of the 29th International Conference on Computational Linguistics",
month = oct,
year = "2022",
address = "Gyeongju, Republic of Korea",
publisher = "International Committee on Computational Linguistics",
url = "https://aclanthology.org/2022.coling-1.344",
pages = "3917--3923",
}
```
|
lingtrain/sanskrit-russian-short | ---
dataset_info:
features:
- name: ru
dtype: string
- name: san
dtype: string
splits:
- name: train
num_bytes: 15746614
num_examples: 36131
download_size: 8244708
dataset_size: 15746614
---
# Dataset Card for "sanskrit-russian-short"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Technoculture__Medtulu-2x7b | ---
pretty_name: Evaluation run of Technoculture/Medtulu-2x7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Technoculture/Medtulu-2x7b](https://huggingface.co/Technoculture/Medtulu-2x7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__Medtulu-2x7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T08:08:44.091130](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medtulu-2x7b/blob/main/results_2024-01-16T08-08-44.091130.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4912286252834545,\n\
\ \"acc_stderr\": 0.03450140674623141,\n \"acc_norm\": 0.4966099863528162,\n\
\ \"acc_norm_stderr\": 0.035271481019980566,\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.016629087514276775,\n \"mc2\": 0.500358139155482,\n\
\ \"mc2_stderr\": 0.015732799808200134\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5034129692832765,\n \"acc_stderr\": 0.014611050403244077,\n\
\ \"acc_norm\": 0.5460750853242321,\n \"acc_norm_stderr\": 0.014549221105171869\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.566122286397132,\n\
\ \"acc_stderr\": 0.004945956744943815,\n \"acc_norm\": 0.7568213503286197,\n\
\ \"acc_norm_stderr\": 0.004281253317507337\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.0404633688397825,\n\
\ \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.0404633688397825\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n\
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n\
\ \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.4861111111111111,\n\
\ \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.038118909889404105,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.038118909889404105\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992083,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992083\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5741935483870968,\n \"acc_stderr\": 0.028129112709165904,\n \"\
acc_norm\": 0.5741935483870968,\n \"acc_norm_stderr\": 0.028129112709165904\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649037,\n \"\
acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649037\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.03646204963253812,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.03646204963253812\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6414141414141414,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.6414141414141414,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.03292296639155142,\n\
\ \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.03292296639155142\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4846153846153846,\n \"acc_stderr\": 0.025339003010106515,\n\
\ \"acc_norm\": 0.4846153846153846,\n \"acc_norm_stderr\": 0.025339003010106515\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833713,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833713\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.41596638655462187,\n \"acc_stderr\": 0.03201650100739615,\n\
\ \"acc_norm\": 0.41596638655462187,\n \"acc_norm_stderr\": 0.03201650100739615\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6770642201834862,\n \"acc_stderr\": 0.020048115923415315,\n \"\
acc_norm\": 0.6770642201834862,\n \"acc_norm_stderr\": 0.020048115923415315\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.032568505702936464,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.032568505702936464\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6274509803921569,\n \"acc_stderr\": 0.03393388584958406,\n \"\
acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.03393388584958406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598028,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598028\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5336322869955157,\n\
\ \"acc_stderr\": 0.033481800170603065,\n \"acc_norm\": 0.5336322869955157,\n\
\ \"acc_norm_stderr\": 0.033481800170603065\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.04356447202665069,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.04356447202665069\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112722,\n\
\ \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112722\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977239,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977239\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7435897435897436,\n\
\ \"acc_stderr\": 0.028605953702004257,\n \"acc_norm\": 0.7435897435897436,\n\
\ \"acc_norm_stderr\": 0.028605953702004257\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6475095785440613,\n\
\ \"acc_stderr\": 0.01708415024408138,\n \"acc_norm\": 0.6475095785440613,\n\
\ \"acc_norm_stderr\": 0.01708415024408138\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5404624277456648,\n \"acc_stderr\": 0.02683080599895224,\n\
\ \"acc_norm\": 0.5404624277456648,\n \"acc_norm_stderr\": 0.02683080599895224\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2547486033519553,\n\
\ \"acc_stderr\": 0.014572650383409155,\n \"acc_norm\": 0.2547486033519553,\n\
\ \"acc_norm_stderr\": 0.014572650383409155\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.028629916715693413,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.028629916715693413\n \
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.027882383791325953,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.027882383791325953\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.027744313443376536,\n\
\ \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.027744313443376536\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.02872386385328128,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.02872386385328128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38070404172099087,\n\
\ \"acc_stderr\": 0.012401430654645898,\n \"acc_norm\": 0.38070404172099087,\n\
\ \"acc_norm_stderr\": 0.012401430654645898\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213514,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213514\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.43790849673202614,\n \"acc_stderr\": 0.020071257886886525,\n \
\ \"acc_norm\": 0.43790849673202614,\n \"acc_norm_stderr\": 0.020071257886886525\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n\
\ \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6616915422885572,\n\
\ \"acc_stderr\": 0.03345563070339191,\n \"acc_norm\": 0.6616915422885572,\n\
\ \"acc_norm_stderr\": 0.03345563070339191\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n\
\ \"acc_stderr\": 0.03819486140758398,\n \"acc_norm\": 0.4036144578313253,\n\
\ \"acc_norm_stderr\": 0.03819486140758398\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.016629087514276775,\n \"mc2\": 0.500358139155482,\n\
\ \"mc2_stderr\": 0.015732799808200134\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.728492501973165,\n \"acc_stderr\": 0.012499326254893129\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16982562547384383,\n \
\ \"acc_stderr\": 0.0103425723608612\n }\n}\n```"
repo_url: https://huggingface.co/Technoculture/Medtulu-2x7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|arc:challenge|25_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|gsm8k|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hellaswag|10_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T08-08-44.091130.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T08-08-44.091130.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- '**/details_harness|winogrande|5_2024-01-16T08-08-44.091130.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T08-08-44.091130.parquet'
- config_name: results
data_files:
- split: 2024_01_16T08_08_44.091130
path:
- results_2024-01-16T08-08-44.091130.parquet
- split: latest
path:
- results_2024-01-16T08-08-44.091130.parquet
---
# Dataset Card for Evaluation run of Technoculture/Medtulu-2x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/Medtulu-2x7b](https://huggingface.co/Technoculture/Medtulu-2x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__Medtulu-2x7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-16T08:08:44.091130](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__Medtulu-2x7b/blob/main/results_2024-01-16T08-08-44.091130.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the per-task configurations and their "latest" splits):
```python
{
"all": {
"acc": 0.4912286252834545,
"acc_stderr": 0.03450140674623141,
"acc_norm": 0.4966099863528162,
"acc_norm_stderr": 0.035271481019980566,
"mc1": 0.34394124847001223,
"mc1_stderr": 0.016629087514276775,
"mc2": 0.500358139155482,
"mc2_stderr": 0.015732799808200134
},
"harness|arc:challenge|25": {
"acc": 0.5034129692832765,
"acc_stderr": 0.014611050403244077,
"acc_norm": 0.5460750853242321,
"acc_norm_stderr": 0.014549221105171869
},
"harness|hellaswag|10": {
"acc": 0.566122286397132,
"acc_stderr": 0.004945956744943815,
"acc_norm": 0.7568213503286197,
"acc_norm_stderr": 0.004281253317507337
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.0404633688397825,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.0404633688397825
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.04179596617581,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.04179596617581
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.038118909889404105,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.038118909889404105
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992083,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992083
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165904,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165904
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.03476725747649037,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.03476725747649037
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.03646204963253812,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.03646204963253812
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6414141414141414,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.6414141414141414,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.03292296639155142,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.03292296639155142
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4846153846153846,
"acc_stderr": 0.025339003010106515,
"acc_norm": 0.4846153846153846,
"acc_norm_stderr": 0.025339003010106515
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833713,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833713
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.41596638655462187,
"acc_stderr": 0.03201650100739615,
"acc_norm": 0.41596638655462187,
"acc_norm_stderr": 0.03201650100739615
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6770642201834862,
"acc_stderr": 0.020048115923415315,
"acc_norm": 0.6770642201834862,
"acc_norm_stderr": 0.020048115923415315
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.032568505702936464,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.032568505702936464
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.03393388584958406,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.03393388584958406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598028,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598028
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5336322869955157,
"acc_stderr": 0.033481800170603065,
"acc_norm": 0.5336322869955157,
"acc_norm_stderr": 0.033481800170603065
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.04356447202665069,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.04356447202665069
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.03889066619112722,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.03889066619112722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7435897435897436,
"acc_stderr": 0.028605953702004257,
"acc_norm": 0.7435897435897436,
"acc_norm_stderr": 0.028605953702004257
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6475095785440613,
"acc_stderr": 0.01708415024408138,
"acc_norm": 0.6475095785440613,
"acc_norm_stderr": 0.01708415024408138
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5404624277456648,
"acc_stderr": 0.02683080599895224,
"acc_norm": 0.5404624277456648,
"acc_norm_stderr": 0.02683080599895224
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2547486033519553,
"acc_stderr": 0.014572650383409155,
"acc_norm": 0.2547486033519553,
"acc_norm_stderr": 0.014572650383409155
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5,
"acc_stderr": 0.028629916715693413,
"acc_norm": 0.5,
"acc_norm_stderr": 0.028629916715693413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.027882383791325953,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.027882383791325953
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.027744313443376536,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.027744313443376536
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.02872386385328128,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.02872386385328128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38070404172099087,
"acc_stderr": 0.012401430654645898,
"acc_norm": 0.38070404172099087,
"acc_norm_stderr": 0.012401430654645898
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213514,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213514
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.43790849673202614,
"acc_stderr": 0.020071257886886525,
"acc_norm": 0.43790849673202614,
"acc_norm_stderr": 0.020071257886886525
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6616915422885572,
"acc_stderr": 0.03345563070339191,
"acc_norm": 0.6616915422885572,
"acc_norm_stderr": 0.03345563070339191
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.03819486140758398,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.03819486140758398
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34394124847001223,
"mc1_stderr": 0.016629087514276775,
"mc2": 0.500358139155482,
"mc2_stderr": 0.015732799808200134
},
"harness|winogrande|5": {
"acc": 0.728492501973165,
"acc_stderr": 0.012499326254893129
},
"harness|gsm8k|5": {
"acc": 0.16982562547384383,
"acc_stderr": 0.0103425723608612
}
}
```
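Beyond loading a single task's details, the per-task scores above can be post-processed locally once fetched. A minimal sketch averaging a few of the accuracies copied from the JSON above (the task subset is arbitrary and chosen only for illustration):

```python
# A few per-task accuracies copied from the results JSON above (illustrative subset)
scores = {
    "harness|arc:challenge|25": 0.5034129692832765,
    "harness|hellaswag|10": 0.566122286397132,
    "harness|winogrande|5": 0.728492501973165,
}

def mean_acc(task_scores):
    """Unweighted average accuracy across the selected tasks."""
    return sum(task_scores.values()) / len(task_scores)

print(round(mean_acc(scores), 4))  # → 0.5993
```

The leaderboard itself applies its own aggregation over all tasks; this sketch only shows how to work with the numbers after loading the "results" configuration.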
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zicsx/mC4-Hindi-Cleaned | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 24677697357.760128
num_examples: 5251576
download_size: 9175340652
dataset_size: 24677697357.760128
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
language:
- hi
tags:
- mC4
size_categories:
- 10M<n<100M
---
# Dataset Card for "mC4-Hindi-Cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/durga_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of durga/ドゥルガー/杜尔伽 (Fate/Grand Order)
This is the dataset of durga/ドゥルガー/杜尔伽 (Fate/Grand Order), containing 114 images and their tags.
The core tags of this character are `breasts, long_hair, hair_ribbon, red_eyes, large_breasts, ribbon, earrings, very_long_hair, grey_hair, colored_skin, red_skin, gradient_skin, facial_mark, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 114 | 246.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/durga_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 114 | 207.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/durga_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 301 | 400.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/durga_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/durga_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
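The IMG+TXT packages (e.g. `dataset-1200.zip`) pair each image with a same-named `.txt` sidecar file holding its tags. A minimal sketch for collecting those pairs after extraction, assuming comma-separated tags in the sidecar files (the `load_img_txt_pairs` helper is illustrative, not part of the dataset):

```python
import os

def load_img_txt_pairs(dataset_dir):
    """Pair each image file with the tag list from its same-named .txt sidecar."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if not os.path.exists(txt_path):
            # skip images without a tag sidecar
            continue
        with open(txt_path, encoding='utf-8') as f:
            tags = [t.strip() for t in f.read().split(',') if t.strip()]
        pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```

Each returned pair is `(image_path, tag_list)`, ready to feed into a captioned-image training pipeline.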
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, armlet, bare_shoulders, belly_chain, body_markings, bracelet, cleavage, collarbone, forehead_mark, looking_at_viewer, pelvic_curtain, revealing_clothes, sash, snake, solo, thighs, thumb_ring, open_mouth |
| 1 | 6 |  |  |  |  |  | 1girl, armlet, bare_shoulders, belly_chain, body_markings, bracelet, cleavage, looking_at_viewer, pelvic_curtain, revealing_clothes, sash, snake, solo, thighs, thumb_ring, navel |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | armlet | bare_shoulders | belly_chain | body_markings | bracelet | cleavage | collarbone | forehead_mark | looking_at_viewer | pelvic_curtain | revealing_clothes | sash | snake | solo | thighs | thumb_ring | open_mouth | navel |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-----------------|:--------------|:----------------|:-----------|:-----------|:-------------|:----------------|:--------------------|:-----------------|:--------------------|:-------|:--------|:-------|:---------|:-------------|:-------------|:--------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | | | X | X | X | X | X | X | X | X | | X |
|
sayan1101/finetune_run2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
struct:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1185515655
num_examples: 2585615
download_size: 667868561
dataset_size: 1185515655
---
# Dataset Card for "finetune_run2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vinnyyw/Anahivoice | ---
license: openrail
---
|
GreeneryScenery/SheepsNet | ---
tags:
- art
- SketchyCOCO
---
# V1
The images are from [SketchyCOCO](https://github.com/sysu-imsl/SketchyCOCO). 🤗
Things to improve:
- Better prompts
- More variety
- More sheep
ohtaman/aozora_kids | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: title
dtype: string
- name: author
dtype: string
- name: content
dtype: string
- name: filename
dtype: string
- name: category
dtype: string
- name: short_description
dtype: string
- name: char_kana_type
dtype: string
- name: story
dtype: string
splits:
- name: train
num_bytes: 85891851
num_examples: 1221
- name: test
num_bytes: 586251
num_examples: 8
download_size: 42922184
dataset_size: 86478102
---
# Dataset Card for "aozora_kids"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
loremipsum3658/pet | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: fname
dtype: string
- name: raw_text
dtype: string
- name: aviso_previo
dtype: bool
- name: saldo_de_salario
dtype: bool
- name: ferias
dtype: bool
- name: decimo_terceiro
dtype: bool
- name: fgts
dtype: bool
- name: multa_do_477
dtype: bool
- name: multa_do_467
dtype: bool
- name: horas_extras
dtype: bool
- name: intervalo_intrajornada
dtype: bool
- name: intervalo_interjornada
dtype: bool
- name: adicional_noturno
dtype: bool
- name: adicional_de_insalubridade
dtype: bool
- name: adicional_de_periculosidade
dtype: bool
- name: diferencas_salariais_ou_equiparacao_salarial
dtype: bool
- name: dano_moral
dtype: bool
- name: contribuicao_assistencial
dtype: bool
- name: indenizacao_por_lucros_cessantes
dtype: bool
- name: indenizacao_por_dano_emergente
dtype: bool
- name: multa_normativa
dtype: bool
- name: honorarios_advocaticios
dtype: bool
- name: justica_gratuita
dtype: bool
- name: reconhecimento_de_vinculo
dtype: bool
- name: reflexos_das_parcelas_salariais
dtype: bool
- name: reflexos_de_salarios_oficiosos_e_informais
dtype: bool
- name: outros
dtype: bool
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1654516
num_examples: 1705
- name: test
num_bytes: 351964
num_examples: 366
- name: validation
num_bytes: 332831
num_examples: 366
download_size: 1391885
dataset_size: 2339311
---
# Dataset Card for "pet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
prsdm/finance-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2093904
num_examples: 1000
download_size: 1215053
dataset_size: 2093904
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MauriceV2021/AuroraSDGsDataset | ---
license: cc-by-4.0
---
# Aurora SDGs Dataset
This dataset contains metadata for 1.4 million research papers: each paper's abstract plus its SDG labels for the Goals and Targets.
martinvanaud/scenario-279-18012024 | ---
dataset_info:
features:
- name: Text
dtype: string
- name: Label
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 27546
num_examples: 223
- name: test
num_bytes: 5432
num_examples: 56
download_size: 24977
dataset_size: 32978
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
StivenLancheros/all_datasets_wikis | ---
dataset_info:
features:
- name: src_title
dtype: string
- name: tgt_title
dtype: string
- name: src_summary
dtype: string
- name: tgt_summary
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gem_id
dtype: string
- name: gem_parent_id
dtype: string
- name: id
dtype: string
- name: src_document
sequence:
- name: title
dtype: string
- name: section_level
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 6735593897
num_examples: 440000
download_size: 2531579730
dataset_size: 6735593897
---
# Dataset Card for "all_datasets_wikis"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
linhtran92/tts_male | ---
dataset_info:
features:
- name: sentence_norm
dtype: string
- name: audio
struct:
- name: array
sequence: int64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: wer
dtype: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 222336754
num_examples: 499
download_size: 45628084
dataset_size: 222336754
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "tts_male"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
james-burton/OrientalMuseum_min3-3Dwhite-num | ---
dataset_info:
features:
- name: label
dtype:
class_label:
names:
'0': DUROM.1950.10.a-b
'1': DUROM.1950.33.a-b
'2': DUROM.1952.1.21.b
'3': DUROM.1954.Spalding29.W
'4': DUROM.1954.Spalding32.a-j
'5': DUROM.1960.1012.a-b
'6': DUROM.1960.1215.a-b
'7': DUROM.1960.1276.a-b
'8': DUROM.1960.1359.a-b
'9': DUROM.1960.1489.b
'10': DUROM.1960.1784.a-b
'11': DUROM.1960.1885.c
'12': DUROM.1960.1908.a-b
'13': DUROM.1960.1951.a-b
'14': DUROM.1960.2068.a-b
'15': DUROM.1960.2224.a-b
'16': DUROM.1960.2255.a-c
'17': DUROM.1960.2349.a-b
'18': DUROM.1960.2395.A-B
'19': DUROM.1960.2448.a-b
'20': DUROM.1960.2456.b
'21': DUROM.1960.2566.a-b
'22': DUROM.1960.2645.A
'23': DUROM.1960.2996.a-b
'24': DUROM.1960.3070.a-b
'25': DUROM.1960.3200.h
'26': DUROM.1960.3253.a-b
'27': DUROM.1960.3295.A-B
'28': DUROM.1960.3400.a-b
'29': DUROM.1960.3449.a-b
'30': DUROM.1960.3573.a-b
'31': DUROM.1960.3685.a-b
'32': DUROM.1960.3969.a-b
'33': DUROM.1960.412.a-b
'34': DUROM.1960.589.a-b
'35': DUROM.1960.592.a-b
'36': DUROM.1960.827.a-b
'37': DUROM.1960.891.c
'38': DUROM.1960.945.c
'39': DUROM.1961.27.B
'40': DUROM.1961.31.B
'41': DUROM.1961.34.a-b
'42': DUROM.1961.40.a-b
'43': DUROM.1961.44.c
'44': DUROM.1962.210.B
'45': DUROM.1962.251.a
'46': DUROM.1962.253.f
'47': DUROM.1962.99.a-c
'48': DUROM.1963.50.b
'49': DUROM.1963.52.A
'50': DUROM.1964.12.F
'51': DUROM.1965.25.B
'52': DUROM.1966.2.A-B
'53': DUROM.1966.45.B
'54': DUROM.1966.46.B
'55': DUROM.1966.62.B
'56': DUROM.1967.23.k
'57': DUROM.1967.40.a-b
'58': DUROM.1968.139.b
'59': DUROM.1968.15.c.a-b
'60': DUROM.1968.178.a-b
'61': DUROM.1968.185.a-b
'62': DUROM.1968.43.d-e
'63': DUROM.1968.46.b
'64': DUROM.1968.64.a-b
'65': DUROM.1968.72.b
'66': DUROM.1968.79.a-b
'67': DUROM.1969.104.c
'68': DUROM.1969.144.B
'69': DUROM.1969.148.a-b
'70': DUROM.1969.150.b
'71': DUROM.1969.162.b
'72': DUROM.1969.166.b
'73': DUROM.1969.169.b
'74': DUROM.1969.171.b
'75': DUROM.1969.186.b
'76': DUROM.1969.188.b
'77': DUROM.1969.189.b
'78': DUROM.1969.190.b
'79': DUROM.1969.192.b
'80': DUROM.1969.194.a-b
'81': DUROM.1969.199.A
'82': DUROM.1969.20.c
'83': DUROM.1969.200.b
'84': DUROM.1969.204.b
'85': DUROM.1969.206.b
'86': DUROM.1969.21.a-b
'87': DUROM.1969.217.a-b
'88': DUROM.1969.218.b
'89': DUROM.1969.226.c
'90': DUROM.1969.232.B
'91': DUROM.1969.236.C
'92': DUROM.1969.238.b
'93': DUROM.1969.246.B
'94': DUROM.1969.247.b
'95': DUROM.1969.267.B
'96': DUROM.1969.29.B
'97': DUROM.1969.305.b
'98': DUROM.1969.313.b
'99': DUROM.1969.33.c
'100': DUROM.1969.344.c
'101': DUROM.1969.355.b
'102': DUROM.1969.367.a-b
'103': DUROM.1969.37.c
'104': DUROM.1969.374.A-B
'105': DUROM.1969.444.A
'106': DUROM.1969.46.b
'107': DUROM.1969.47.b
'108': DUROM.1969.480.B
'109': DUROM.1969.51.a-b
'110': DUROM.1969.540.B
'111': DUROM.1969.549.b
'112': DUROM.1969.568.p
'113': DUROM.1969.592.A
'114': DUROM.1969.614.a-b
'115': DUROM.1969.63.A-B
'116': DUROM.1969.679.a-b
'117': DUROM.1969.69.a-b
'118': DUROM.1969.77.c
'119': DUROM.1970.102.a-b
'120': DUROM.1970.108.B
'121': DUROM.1970.23.b
'122': DUROM.1970.8.B
'123': DUROM.1970.81.B
'124': DUROM.1971.17.e
'125': DUROM.1971.25.g
'126': DUROM.1971.28.B
'127': DUROM.1971.29.b
'128': DUROM.1971.31.l
'129': DUROM.1971.33.d
'130': DUROM.1971.48.b
'131': DUROM.1971.56.f
'132': DUROM.1971.60.A
'133': DUROM.1972.18.5.E
'134': DUROM.1972.19.d
'135': DUROM.1972.33.b
'136': DUROM.1973.12.b
'137': DUROM.1973.27.a-b
'138': DUROM.1973.28.b
'139': DUROM.1973.41.B
'140': DUROM.1974.12.d
'141': DUROM.1974.31.D
'142': DUROM.1974.5.A-B
'143': DUROM.1975.1.d
'144': DUROM.1975.9.w
'145': DUROM.1976.119.a-b
'146': DUROM.1976.153.c
'147': DUROM.1976.154.d
'148': DUROM.1976.156.g
'149': DUROM.1976.157.b
'150': DUROM.1976.176.e
'151': DUROM.1976.18.a-b
'152': DUROM.1976.19.B
'153': DUROM.1976.4.A-B
'154': DUROM.1976.41.F
'155': DUROM.1977.103.D
'156': DUROM.1977.89.b
'157': DUROM.1978.103.A-C
'158': DUROM.1978.138.m
'159': DUROM.1978.140.d
'160': DUROM.1978.15.G
'161': DUROM.1978.19.E
'162': DUROM.1979.38.a-b
'163': DUROM.1979.44.a-b
'164': DUROM.1979.48.a-b
'165': DUROM.1979.60.a-b
'166': DUROM.1979.8.b
'167': DUROM.1980.16.A-B
'168': DUROM.1980.18.b
'169': DUROM.1980.77.a-b
'170': DUROM.1981.12.A-B
'171': DUROM.1981.13.A-B
'172': DUROM.1981.14.a-b
'173': DUROM.1981.15.A-B
'174': DUROM.1981.16.a-b
'175': DUROM.1981.23.B
'176': DUROM.1981.27.b
'177': DUROM.1987.38.a
'178': DUROM.1987.40.B
'179': DUROM.1991.102.a-b
'180': DUROM.1991.135.a-b
'181': DUROM.1991.155.a-b
'182': DUROM.1991.30.H
'183': DUROM.1991.31.F
'184': DUROM.1991.83.a-b
'185': DUROM.1991.84.a-b
'186': DUROM.1992.105.a-b
'187': DUROM.1992.110.a-b
'188': DUROM.1992.12.a-b
'189': DUROM.1992.125.c
'190': DUROM.1992.13.B
'191': DUROM.1992.158.B
'192': DUROM.1992.69.a-b
'193': DUROM.1993.143.B
'194': DUROM.1993.145.F
'195': DUROM.1993.99.B
'196': DUROM.1994.4.a-c
'197': DUROM.1994.7.B
'198': DUROM.1994.8.c
'199': durma.1985.100.19
'200': durma.1985.100.62
'201': durma.1985.52.35
'202': durma.1985.52.37
'203': durma.1985.62.1
'204': durma.1985.64.121
'205': durma.1985.64.1243
'206': durma.1985.64.218
'207': durma.1985.64.219
'208': durma.1985.64.220
'209': durma.1985.64.221
'210': durma.1985.64.222
'211': durma.1985.64.223
'212': durma.1985.81.314.1
'213': durma.1985.81.4496
'214': durma.1985.9.1
'215': durma.1986.110.1
'216': durma.1986.126.1
'217': durma.1986.127.2
'218': durma.1986.151.1
'219': durma.1986.161.1
'220': durma.1986.173.1
'221': durma.1986.173.2
'222': durma.1989.44.1
'223': durma.1989.53.1
'224': durma.1989.54.1
'225': durma.2000.1.1
'226': durma.2000.4.1
'227': durma.2006.1.68
'228': durma.2014.2.1
'229': durma.2014.6.2
'230': durma.2017.13
'231': durma.2017.14
'232': durma.2017.19
'233': durma.2020.1.13.1
'234': durma.2020.1.2
'235': durma.2020.1.23
'236': durma.2020.1.40
'237': durma.2020.1.7
'238': durma.2020.1.8
'239': durma.2020.2
'240': durma.2020.3.1043
'241': durma.2020.3.2302
'242': durma.2020.3.2314
'243': durma.2020.3.2319
'244': durma.2020.3.41
'245': durma.2020.3.411
'246': durma.2020.3.412
'247': durma.2020.3.419
'248': durma.2020.3.52
'249': durma.2020.3.53
'250': durma.2020.3.56
'251': durma.2020.3.672
'252': durma.2020.3.723
'253': durma.2020.3.726
'254': durma.2020.3.729
'255': durma.2020.3.734
'256': durma.2020.3.744
'257': durma.2020.3.746
'258': durma.2020.3.765
'259': durma.2020.3.862
'260': durma.2020.3.915
'261': durma.2020.3.920
'262': durma.2020.3.921
'263': durom.1197
'264': durom.1921.840
'265': durom.1940.48
'266': durom.1949.1
'267': durom.1949.3
'268': durom.1949.5
'269': durom.1950.1
'270': durom.1950.3
'271': durom.1950.36
'272': durom.1950.4
'273': durom.1950.42
'274': durom.1950.48
'275': durom.1950.7
'276': durom.1951.1
'277': durom.1951.20
'278': durom.1951.4
'279': durom.1951.45
'280': durom.1951.52
'281': durom.1952.1.1
'282': durom.1952.1.10
'283': durom.1952.1.12
'284': durom.1952.1.13
'285': durom.1952.1.14
'286': durom.1952.1.15
'287': durom.1952.1.16
'288': durom.1952.1.17
'289': durom.1952.1.18
'290': durom.1952.1.20
'291': durom.1952.1.23
'292': durom.1952.1.24
'293': durom.1952.1.26
'294': durom.1952.1.28
'295': durom.1952.1.29
'296': durom.1952.1.3
'297': durom.1952.1.30
'298': durom.1952.1.31
'299': durom.1952.1.33
'300': durom.1952.1.36
'301': durom.1952.1.37
'302': durom.1952.1.38
'303': durom.1952.1.39
'304': durom.1952.1.4
'305': durom.1952.1.44
'306': durom.1952.1.45
'307': durom.1952.1.46
'308': durom.1952.1.47
'309': durom.1952.1.48
'310': durom.1952.1.49
'311': durom.1952.1.5
'312': durom.1952.1.50
'313': durom.1952.1.51
'314': durom.1952.1.52
'315': durom.1952.1.53
'316': durom.1952.1.54
'317': durom.1952.1.56
'318': durom.1952.1.57
'319': durom.1952.1.58
'320': durom.1952.1.59
'321': durom.1952.1.6
'322': durom.1952.1.60
'323': durom.1952.1.61
'324': durom.1952.1.62
'325': durom.1952.1.67
'326': durom.1952.1.68
'327': durom.1952.1.9
'328': durom.1952.2
'329': durom.1952.3
'330': durom.1952.34
'331': durom.1952.4
'332': durom.1952.9
'333': durom.1953.1
'334': durom.1953.10
'335': durom.1953.102
'336': durom.1953.121
'337': durom.1953.122
'338': durom.1953.171
'339': durom.1953.173
'340': durom.1953.176
'341': durom.1953.179
'342': durom.1953.183
'343': durom.1953.189
'344': durom.1953.204
'345': durom.1953.205
'346': durom.1953.206
'347': durom.1953.207
'348': durom.1953.209
'349': durom.1953.210
'350': durom.1953.211
'351': durom.1953.213
'352': durom.1953.214
'353': durom.1953.218
'354': durom.1953.220
'355': durom.1953.221
'356': durom.1953.222
'357': durom.1953.223
'358': durom.1953.224
'359': durom.1953.228
'360': durom.1953.23
'361': durom.1953.233
'362': durom.1953.234
'363': durom.1953.236
'364': durom.1953.237
'365': durom.1953.238
'366': durom.1953.242
'367': durom.1953.3
'368': durom.1953.36
'369': durom.1953.50
'370': durom.1953.54
'371': durom.1953.79
'372': durom.1953.84
'373': durom.1953.95
'374': durom.1953.98
'375': durom.1954.7
'376': durom.1954.spalding1
'377': durom.1954.spalding12
'378': durom.1954.spalding13
'379': durom.1954.spalding17
'380': durom.1954.spalding18
'381': durom.1954.spalding2
'382': durom.1954.spalding24
'383': durom.1954.spalding25
'384': durom.1954.spalding26
'385': durom.1954.spalding3
'386': durom.1954.spalding33
'387': durom.1954.spalding33a
'388': durom.1954.spalding33c
'389': durom.1954.spalding33e
'390': durom.1954.spalding33g
'391': durom.1954.spalding36
'392': durom.1954.spalding37
'393': durom.1954.spalding39
'394': durom.1954.spalding4
'395': durom.1954.spalding49
'396': durom.1954.spalding50
'397': durom.1954.spalding53
'398': durom.1954.spalding6
'399': durom.1954.spalding63
'400': durom.1954.spalding7
'401': durom.1954.spalding72
'402': durom.1954.spalding76
'403': durom.1954.spalding8
'404': durom.1955.lennard11
'405': durom.1955.lennard8
'406': durom.1955.lennard9
'407': durom.1956.keighley11
'408': durom.1956.keighley12
'409': durom.1956.keighley14
'410': durom.1956.yetts1
'411': durom.1956.yetts13
'412': durom.1956.yetts22
'413': durom.1956.yetts25
'414': durom.1956.yetts38
'415': durom.1956.yetts39
'416': durom.1956.yetts40.25
'417': durom.1956.yetts42
'418': durom.1960.1016
'419': durom.1960.1018
'420': durom.1960.1019
'421': durom.1960.1041
'422': durom.1960.1042
'423': durom.1960.1068
'424': durom.1960.1076
'425': durom.1960.1082
'426': durom.1960.1083
'427': durom.1960.1114
'428': durom.1960.1127
'429': durom.1960.1129
'430': durom.1960.1143
'431': durom.1960.1148
'432': durom.1960.1157
'433': durom.1960.1160
'434': durom.1960.1161
'435': durom.1960.1180
'436': durom.1960.1193
'437': durom.1960.1196
'438': durom.1960.1201
'439': durom.1960.1222
'440': durom.1960.1227
'441': durom.1960.1231
'442': durom.1960.1284
'443': durom.1960.1309
'444': durom.1960.1310
'445': durom.1960.1318
'446': durom.1960.1337
'447': durom.1960.1354
'448': durom.1960.1355
'449': durom.1960.1375
'450': durom.1960.1388
'451': durom.1960.1409
'452': durom.1960.1425
'453': durom.1960.1428
'454': durom.1960.1433
'455': durom.1960.1439
'456': durom.1960.1441
'457': durom.1960.1462
'458': durom.1960.1479
'459': durom.1960.1481
'460': durom.1960.1486
'461': durom.1960.1540
'462': durom.1960.1571
'463': durom.1960.1581
'464': durom.1960.1586
'465': durom.1960.1603
'466': durom.1960.1608
'467': durom.1960.1616
'468': durom.1960.1648
'469': durom.1960.1657
'470': durom.1960.1662
'471': durom.1960.1667
'472': durom.1960.1682
'473': durom.1960.1709
'474': durom.1960.1729
'475': durom.1960.1739
'476': durom.1960.1743
'477': durom.1960.1747
'478': durom.1960.1768
'479': durom.1960.1772
'480': durom.1960.1790
'481': durom.1960.1795
'482': durom.1960.1830
'483': durom.1960.1848
'484': durom.1960.1858
'485': durom.1960.1861
'486': durom.1960.1889
'487': durom.1960.1902
'488': durom.1960.1907
'489': durom.1960.1912
'490': durom.1960.1920
'491': durom.1960.1940
'492': durom.1960.1941
'493': durom.1960.1945
'494': durom.1960.1949
'495': durom.1960.1966
'496': durom.1960.1975
'497': durom.1960.2
'498': durom.1960.2062
'499': durom.1960.2074
'500': durom.1960.2091
'501': durom.1960.2092
'502': durom.1960.2098
'503': durom.1960.2115
'504': durom.1960.2118
'505': durom.1960.2120
'506': durom.1960.2138
'507': durom.1960.2142
'508': durom.1960.2146
'509': durom.1960.2151
'510': durom.1960.2160
'511': durom.1960.2189
'512': durom.1960.2190
'513': durom.1960.2205
'514': durom.1960.2206
'515': durom.1960.2211
'516': durom.1960.2225
'517': durom.1960.2232
'518': durom.1960.2233
'519': durom.1960.2237
'520': durom.1960.2238
'521': durom.1960.2260
'522': durom.1960.2278
'523': durom.1960.2295
'524': durom.1960.2303
'525': durom.1960.2307
'526': durom.1960.2320
'527': durom.1960.2323
'528': durom.1960.2332
'529': durom.1960.2345
'530': durom.1960.2358
'531': durom.1960.2364
'532': durom.1960.2391
'533': durom.1960.2408
'534': durom.1960.2430
'535': durom.1960.2431
'536': durom.1960.2447
'537': durom.1960.2449
'538': durom.1960.2453
'539': durom.1960.2463
'540': durom.1960.2469
'541': durom.1960.2495
'542': durom.1960.2502
'543': durom.1960.2506
'544': durom.1960.2508
'545': durom.1960.2512
'546': durom.1960.2518
'547': durom.1960.2521
'548': durom.1960.2542.1
'549': durom.1960.2542.2
'550': durom.1960.2552
'551': durom.1960.2562
'552': durom.1960.2563
'553': durom.1960.2564
'554': durom.1960.2565
'555': durom.1960.2592
'556': durom.1960.2595
'557': durom.1960.2599
'558': durom.1960.2615
'559': durom.1960.2620
'560': durom.1960.2622
'561': durom.1960.2630
'562': durom.1960.2631
'563': durom.1960.2677
'564': durom.1960.2687
'565': durom.1960.2689
'566': durom.1960.2691
'567': durom.1960.2692
'568': durom.1960.2704
'569': durom.1960.2713
'570': durom.1960.2724
'571': durom.1960.2758
'572': durom.1960.2765
'573': durom.1960.2782
'574': durom.1960.2798
'575': durom.1960.2805
'576': durom.1960.2806
'577': durom.1960.2824
'578': durom.1960.2830
'579': durom.1960.2844
'580': durom.1960.2846
'581': durom.1960.2861
'582': durom.1960.2863
'583': durom.1960.2868
'584': durom.1960.2981
'585': durom.1960.2991
'586': durom.1960.2998
'587': durom.1960.3009
'588': durom.1960.3021
'589': durom.1960.3050
'590': durom.1960.3061
'591': durom.1960.3062
'592': durom.1960.3063
'593': durom.1960.3072
'594': durom.1960.3099
'595': durom.1960.3116
'596': durom.1960.3118
'597': durom.1960.3122
'598': durom.1960.3151
'599': durom.1960.3177
'600': durom.1960.3183
'601': durom.1960.3198
'602': durom.1960.3202
'603': durom.1960.3239
'604': durom.1960.3242
'605': durom.1960.3265
'606': durom.1960.3271
'607': durom.1960.3272
'608': durom.1960.3297
'609': durom.1960.3303
'610': durom.1960.3307
'611': durom.1960.3325
'612': durom.1960.3335
'613': durom.1960.3340
'614': durom.1960.3357
'615': durom.1960.3360
'616': durom.1960.3373
'617': durom.1960.3391
'618': durom.1960.3402
'619': durom.1960.3409
'620': durom.1960.3418
'621': durom.1960.3421
'622': durom.1960.3425
'623': durom.1960.3428
'624': durom.1960.3430
'625': durom.1960.3432
'626': durom.1960.3436
'627': durom.1960.3437
'628': durom.1960.3440
'629': durom.1960.3458
'630': durom.1960.3459
'631': durom.1960.3462
'632': durom.1960.3470
'633': durom.1960.3503
'634': durom.1960.3511
'635': durom.1960.3523
'636': durom.1960.3524
'637': durom.1960.3526
'638': durom.1960.3528
'639': durom.1960.3535
'640': durom.1960.3538
'641': durom.1960.3542
'642': durom.1960.3553
'643': durom.1960.3554
'644': durom.1960.3567
'645': durom.1960.3574
'646': durom.1960.3581
'647': durom.1960.3582
'648': durom.1960.3591
'649': durom.1960.3594
'650': durom.1960.3595
'651': durom.1960.3601
'652': durom.1960.3611
'653': durom.1960.3618
'654': durom.1960.3624
'655': durom.1960.3631
'656': durom.1960.3636
'657': durom.1960.3654
'658': durom.1960.3659
'659': durom.1960.3664
'660': durom.1960.3683
'661': durom.1960.3690
'662': durom.1960.3693
'663': durom.1960.3694
'664': durom.1960.370
'665': durom.1960.3728
'666': durom.1960.3731
'667': durom.1960.3734
'668': durom.1960.3735
'669': durom.1960.3743
'670': durom.1960.3747
'671': durom.1960.3751
'672': durom.1960.3763
'673': durom.1960.3765
'674': durom.1960.3798
'675': durom.1960.3813
'676': durom.1960.3816
'677': durom.1960.3821
'678': durom.1960.3825
'679': durom.1960.3835
'680': durom.1960.3846
'681': durom.1960.3853
'682': durom.1960.3854
'683': durom.1960.3860
'684': durom.1960.3862
'685': durom.1960.3865
'686': durom.1960.3878
'687': durom.1960.3898
'688': durom.1960.3901
'689': durom.1960.3907
'690': durom.1960.3912
'691': durom.1960.3917
'692': durom.1960.3927
'693': durom.1960.3928
'694': durom.1960.3931
'695': durom.1960.3942
'696': durom.1960.3947
'697': durom.1960.3948
'698': durom.1960.3951
'699': durom.1960.3952
'700': durom.1960.3953
'701': durom.1960.3955
'702': durom.1960.3964
'703': durom.1960.3965
'704': durom.1960.3966
'705': durom.1960.3967
'706': durom.1960.3970
'707': durom.1960.3982
'708': durom.1960.3991
'709': durom.1960.4006
'710': durom.1960.4009
'711': durom.1960.4010
'712': durom.1960.4016
'713': durom.1960.4022
'714': durom.1960.4030
'715': durom.1960.4043
'716': durom.1960.4071
'717': durom.1960.4078
'718': durom.1960.4080
'719': durom.1960.4105
'720': durom.1960.4113
'721': durom.1960.4135
'722': durom.1960.4151
'723': durom.1960.4168
'724': durom.1960.4169
'725': durom.1960.4173
'726': durom.1960.4183
'727': durom.1960.4194
'728': durom.1960.4201
'729': durom.1960.4202
'730': durom.1960.4205
'731': durom.1960.4241
'732': durom.1960.518
'733': durom.1960.590
'734': durom.1960.6
'735': durom.1960.621
'736': durom.1960.627
'737': durom.1960.655
'738': durom.1960.811
'739': durom.1960.834
'740': durom.1960.837
'741': durom.1960.897
'742': durom.1960.902
'743': durom.1960.937
'744': durom.1960.955
'745': durom.1960.961
'746': durom.1960.962
'747': durom.1960.966
'748': durom.1960.971
'749': durom.1960.977
'750': durom.1960.992
'751': durom.1960.994
'752': durom.1960.cull9
'753': durom.1960.len-con2
'754': durom.1961.10
'755': durom.1961.11
'756': durom.1961.13
'757': durom.1961.14
'758': durom.1961.19
'759': durom.1961.2
'760': durom.1961.26
'761': durom.1961.28
'762': durom.1961.41
'763': durom.1961.42
'764': durom.1961.43
'765': durom.1961.46
'766': durom.1961.47
'767': durom.1961.6
'768': durom.1961.70
'769': durom.1961.71
'770': durom.1961.9
'771': durom.1962.114
'772': durom.1962.2
'773': durom.1962.201
'774': durom.1962.202
'775': durom.1962.205
'776': durom.1962.215
'777': durom.1962.217
'778': durom.1962.221
'779': durom.1962.222
'780': durom.1962.225
'781': durom.1962.226
'782': durom.1962.229
'783': durom.1962.230
'784': durom.1962.246
'785': durom.1962.247
'786': durom.1962.256
'787': durom.1962.258
'788': durom.1962.259
'789': durom.1962.32
'790': durom.1962.78
'791': durom.1962.81
'792': durom.1962.87
'793': durom.1962.88
'794': durom.1962.89
'795': durom.1962.92
'796': durom.1962.94
'797': durom.1962.95
'798': durom.1962.96
'799': durom.1963.1
'800': durom.1963.14
'801': durom.1963.16
'802': durom.1963.18
'803': durom.1963.19
'804': durom.1963.24
'805': durom.1963.25
'806': durom.1963.26
'807': durom.1963.27
'808': durom.1963.33
'809': durom.1963.38
'810': durom.1963.4
'811': durom.1963.43
'812': durom.1963.46
'813': durom.1963.47
'814': durom.1964.103
'815': durom.1964.105
'816': durom.1964.106
'817': durom.1964.110
'818': durom.1964.117
'819': durom.1964.118
'820': durom.1964.119
'821': durom.1964.122
'822': durom.1964.124
'823': durom.1964.125
'824': durom.1964.127
'825': durom.1964.128
'826': durom.1964.131
'827': durom.1964.133
'828': durom.1964.139
'829': durom.1964.149
'830': durom.1964.17
'831': durom.1964.18
'832': durom.1964.19
'833': durom.1964.20
'834': durom.1964.201
'835': durom.1964.205
'836': durom.1964.235
'837': durom.1964.236
'838': durom.1964.34
'839': durom.1964.340
'840': durom.1964.341
'841': durom.1964.342
'842': durom.1964.343
'843': durom.1964.344
'844': durom.1964.347
'845': durom.1964.349
'846': durom.1964.350
'847': durom.1964.351
'848': durom.1964.352
'849': durom.1964.353
'850': durom.1964.354
'851': durom.1964.355
'852': durom.1964.356
'853': durom.1964.357
'854': durom.1964.358
'855': durom.1964.359
'856': durom.1964.360
'857': durom.1964.361
'858': durom.1964.362
'859': durom.1964.363
'860': durom.1964.364
'861': durom.1964.365
'862': durom.1964.366
'863': durom.1964.367
'864': durom.1964.368
'865': durom.1964.370
'866': durom.1964.371
'867': durom.1964.372
'868': durom.1964.373
'869': durom.1964.374
'870': durom.1964.375
'871': durom.1964.376
'872': durom.1964.377
'873': durom.1964.378
'874': durom.1964.379
'875': durom.1964.380
'876': durom.1964.381
'877': durom.1964.382
'878': durom.1964.383
'879': durom.1964.384
'880': durom.1964.385
'881': durom.1964.386
'882': durom.1964.389
'883': durom.1964.390
'884': durom.1964.391
'885': durom.1964.392
'886': durom.1964.393
'887': durom.1964.394
'888': durom.1964.395
'889': durom.1964.396
'890': durom.1964.397
'891': durom.1964.398
'892': durom.1964.399
'893': durom.1964.4
'894': durom.1964.453
'895': durom.1964.454
'896': durom.1964.455
'897': durom.1964.456
'898': durom.1964.458
'899': durom.1964.48
'900': durom.1964.492
'901': durom.1964.493
'902': durom.1964.494
'903': durom.1964.495
'904': durom.1964.496
'905': durom.1964.497
'906': durom.1964.498
'907': durom.1964.499
'908': durom.1964.5.17
'909': durom.1964.5.18
'910': durom.1964.500
'911': durom.1964.501
'912': durom.1964.502
'913': durom.1964.503
'914': durom.1964.504
'915': durom.1964.505
'916': durom.1964.506
'917': durom.1964.507
'918': durom.1964.508
'919': durom.1964.509
'920': durom.1964.51
'921': durom.1964.510
'922': durom.1964.511
'923': durom.1964.512
'924': durom.1964.513
'925': durom.1964.514
'926': durom.1964.515
'927': durom.1964.516
'928': durom.1964.517
'929': durom.1964.518
'930': durom.1964.519
'931': durom.1964.520
'932': durom.1964.521
'933': durom.1964.522
'934': durom.1964.523
'935': durom.1964.524
'936': durom.1964.525
'937': durom.1964.526
'938': durom.1964.527
'939': durom.1964.528
'940': durom.1964.529
'941': durom.1964.530
'942': durom.1964.531
'943': durom.1964.532
'944': durom.1964.533
'945': durom.1964.534
'946': durom.1964.535
'947': durom.1964.536
'948': durom.1964.537
'949': durom.1964.538
'950': durom.1964.539
'951': durom.1964.541
'952': durom.1964.543
'953': durom.1964.544
'954': durom.1964.545
'955': durom.1964.546
'956': durom.1964.547
'957': durom.1964.553
'958': durom.1964.557
'959': durom.1964.583
'960': durom.1964.588
'961': durom.1964.60
'962': durom.1964.74
'963': durom.1964.75
'964': durom.1964.9
'965': durom.1964.90
'966': durom.1964.93
'967': durom.1964.96
'968': durom.1964.98
'969': durom.1965.1
'970': durom.1965.10
'971': durom.1965.12
'972': durom.1965.13
'973': durom.1965.21
'974': durom.1965.32
'975': durom.1965.33
'976': durom.1965.40
'977': durom.1965.46
'978': durom.1965.48
'979': durom.1965.49.3
'980': durom.1965.5
'981': durom.1965.50
'982': durom.1965.52
'983': durom.1965.53
'984': durom.1965.54
'985': durom.1965.56
'986': durom.1965.59
'987': durom.1965.60
'988': durom.1965.61
'989': durom.1965.62
'990': durom.1965.64
'991': durom.1965.67
'992': durom.1965.9
'993': durom.1966.11
'994': durom.1966.14
'995': durom.1966.3
'996': durom.1966.39
'997': durom.1966.44
'998': durom.1966.44.1
'999': durom.1966.5
'1000': durom.1966.56
'1001': durom.1966.6.1
'1002': durom.1966.6.2
'1003': durom.1966.6.3
'1004': durom.1966.6.4
'1005': durom.1966.8
'1006': durom.1967.1
'1007': durom.1967.16
'1008': durom.1967.18.19
'1009': durom.1967.18.2
'1010': durom.1967.18.23
'1011': durom.1967.26
'1012': durom.1967.28
'1013': durom.1967.31
'1014': durom.1967.35
'1015': durom.1968.103
'1016': durom.1968.106
'1017': durom.1968.107
'1018': durom.1968.11
'1019': durom.1968.111
'1020': durom.1968.114
'1021': durom.1968.121
'1022': durom.1968.132
'1023': durom.1968.134
'1024': durom.1968.147
'1025': durom.1968.148
'1026': durom.1968.163
'1027': durom.1968.170
'1028': durom.1968.172
'1029': durom.1968.173
'1030': durom.1968.175
'1031': durom.1968.176
'1032': durom.1968.182
'1033': durom.1968.184
'1034': durom.1968.20
'1035': durom.1968.27
'1036': durom.1968.29
'1037': durom.1968.30
'1038': durom.1968.31
'1039': durom.1968.34.11
'1040': durom.1968.34.19
'1041': durom.1968.34.2
'1042': durom.1968.34.3
'1043': durom.1968.34.4
'1044': durom.1968.34.6
'1045': durom.1968.38
'1046': durom.1968.40
'1047': durom.1968.41
'1048': durom.1968.42
'1049': durom.1968.44
'1050': durom.1968.49
'1051': durom.1968.56.2
'1052': durom.1968.56.34
'1053': durom.1968.56.35
'1054': durom.1968.56.36
'1055': durom.1968.56.37
'1056': durom.1968.56.38
'1057': durom.1968.6
'1058': durom.1968.66
'1059': durom.1968.68
'1060': durom.1968.70
'1061': durom.1968.74
'1062': durom.1968.75
'1063': durom.1968.8
'1064': durom.1968.83
'1065': durom.1968.85
'1066': durom.1968.87
'1067': durom.1968.90
'1068': durom.1968.92
'1069': durom.1968.94
'1070': durom.1968.95
'1071': durom.1968.96
'1072': durom.1968.98
'1073': durom.1968.99
'1074': durom.1969.1
'1075': durom.1969.100
'1076': durom.1969.101
'1077': durom.1969.103
'1078': durom.1969.106
'1079': durom.1969.107
'1080': durom.1969.108
'1081': durom.1969.110
'1082': durom.1969.116
'1083': durom.1969.117
'1084': durom.1969.118
'1085': durom.1969.119
'1086': durom.1969.122
'1087': durom.1969.123
'1088': durom.1969.126
'1089': durom.1969.127
'1090': durom.1969.129
'1091': durom.1969.134
'1092': durom.1969.136
'1093': durom.1969.138
'1094': durom.1969.140
'1095': durom.1969.141
'1096': durom.1969.142
'1097': durom.1969.143
'1098': durom.1969.145
'1099': durom.1969.147
'1100': durom.1969.15
'1101': durom.1969.151
'1102': durom.1969.152
'1103': durom.1969.154
'1104': durom.1969.157
'1105': durom.1969.165
'1106': durom.1969.167
'1107': durom.1969.174
'1108': durom.1969.177
'1109': durom.1969.18
'1110': durom.1969.180
'1111': durom.1969.19
'1112': durom.1969.2
'1113': durom.1969.201
'1114': durom.1969.203
'1115': durom.1969.207
'1116': durom.1969.209
'1117': durom.1969.211
'1118': durom.1969.214
'1119': durom.1969.22
'1120': durom.1969.220
'1121': durom.1969.221
'1122': durom.1969.222
'1123': durom.1969.225
'1124': durom.1969.23
'1125': durom.1969.230
'1126': durom.1969.231
'1127': durom.1969.233
'1128': durom.1969.234
'1129': durom.1969.24
'1130': durom.1969.241
'1131': durom.1969.243
'1132': durom.1969.251
'1133': durom.1969.254
'1134': durom.1969.258
'1135': durom.1969.266
'1136': durom.1969.270
'1137': durom.1969.271
'1138': durom.1969.273
'1139': durom.1969.277
'1140': durom.1969.288
'1141': durom.1969.289
'1142': durom.1969.30
'1143': durom.1969.300
'1144': durom.1969.303
'1145': durom.1969.304
'1146': durom.1969.307
'1147': durom.1969.308
'1148': durom.1969.309
'1149': durom.1969.31
'1150': durom.1969.310
'1151': durom.1969.311
'1152': durom.1969.315
'1153': durom.1969.316
'1154': durom.1969.317
'1155': durom.1969.32
'1156': durom.1969.321
'1157': durom.1969.322
'1158': durom.1969.323
'1159': durom.1969.324
'1160': durom.1969.329
'1161': durom.1969.330
'1162': durom.1969.333
'1163': durom.1969.334
'1164': durom.1969.335
'1165': durom.1969.34
'1166': durom.1969.340
'1167': durom.1969.341
'1168': durom.1969.345
'1169': durom.1969.346
'1170': durom.1969.347
'1171': durom.1969.349
'1172': durom.1969.35
'1173': durom.1969.350
'1174': durom.1969.353
'1175': durom.1969.356
'1176': durom.1969.358
'1177': durom.1969.362
'1178': durom.1969.365
'1179': durom.1969.371
'1180': durom.1969.372
'1181': durom.1969.375
'1182': durom.1969.376
'1183': durom.1969.377
'1184': durom.1969.379.1-3
'1185': durom.1969.382
'1186': durom.1969.384
'1187': durom.1969.39
'1188': durom.1969.391
'1189': durom.1969.399
'1190': durom.1969.4
'1191': durom.1969.40
'1192': durom.1969.406
'1193': durom.1969.408
'1194': durom.1969.41
'1195': durom.1969.411
'1196': durom.1969.413
'1197': durom.1969.417
'1198': durom.1969.419
'1199': durom.1969.421
'1200': durom.1969.424
'1201': durom.1969.426
'1202': durom.1969.429
'1203': durom.1969.43
'1204': durom.1969.433
'1205': durom.1969.434
'1206': durom.1969.449
'1207': durom.1969.453
'1208': durom.1969.460
'1209': durom.1969.461
'1210': durom.1969.474
'1211': durom.1969.475
'1212': durom.1969.476
'1213': durom.1969.48
'1214': durom.1969.482
'1215': durom.1969.483
'1216': durom.1969.487
'1217': durom.1969.491
'1218': durom.1969.493
'1219': durom.1969.494
'1220': durom.1969.495.12
'1221': durom.1969.496
'1222': durom.1969.497
'1223': durom.1969.5
'1224': durom.1969.502
'1225': durom.1969.503
'1226': durom.1969.513
'1227': durom.1969.517
'1228': durom.1969.518
'1229': durom.1969.534
'1230': durom.1969.536
'1231': durom.1969.537
'1232': durom.1969.538
'1233': durom.1969.54
'1234': durom.1969.542
'1235': durom.1969.544
'1236': durom.1969.547
'1237': durom.1969.550
'1238': durom.1969.551
'1239': durom.1969.560
'1240': durom.1969.565
'1241': durom.1969.566
'1242': durom.1969.569
'1243': durom.1969.571
'1244': durom.1969.573
'1245': durom.1969.577
'1246': durom.1969.578
'1247': durom.1969.58
'1248': durom.1969.582
'1249': durom.1969.584
'1250': durom.1969.593
'1251': durom.1969.594
'1252': durom.1969.6
'1253': durom.1969.605
'1254': durom.1969.607
'1255': durom.1969.61
'1256': durom.1969.610
'1257': durom.1969.615
'1258': durom.1969.616
'1259': durom.1969.629
'1260': durom.1969.634
'1261': durom.1969.639
'1262': durom.1969.64
'1263': durom.1969.644
'1264': durom.1969.66
'1265': durom.1969.667
'1266': durom.1969.67
'1267': durom.1969.670
'1268': durom.1969.674
'1269': durom.1969.678
'1270': durom.1969.68
'1271': durom.1969.687
'1272': durom.1969.688.4
'1273': durom.1969.689
'1274': durom.1969.693
'1275': durom.1969.695
'1276': durom.1969.7
'1277': durom.1969.70
'1278': durom.1969.704
'1279': durom.1969.707
'1280': durom.1969.72
'1281': durom.1969.73
'1282': durom.1969.78
'1283': durom.1969.8
'1284': durom.1969.80
'1285': durom.1969.81
'1286': durom.1969.85
'1287': durom.1969.87
'1288': durom.1969.88
'1289': durom.1969.89
'1290': durom.1969.9
'1291': durom.1969.91
'1292': durom.1969.95
'1293': durom.1969.96
'1294': durom.1969.97
'1295': durom.1969.98
'1296': durom.1970.105
'1297': durom.1970.106.1
'1298': durom.1970.106.14
'1299': durom.1970.106.2
'1300': durom.1970.106.3
'1301': durom.1970.106.4
'1302': durom.1970.106.5
'1303': durom.1970.106.6
'1304': durom.1970.106.7
'1305': durom.1970.17
'1306': durom.1970.18
'1307': durom.1970.22
'1308': durom.1970.25
'1309': durom.1970.26
'1310': durom.1970.3
'1311': durom.1970.4
'1312': durom.1970.48
'1313': durom.1970.50.19
'1314': durom.1970.55
'1315': durom.1970.63
'1316': durom.1970.64
'1317': durom.1970.65
'1318': durom.1970.67
'1319': durom.1970.70
'1320': durom.1970.73
'1321': durom.1970.75
'1322': durom.1970.76
'1323': durom.1970.82
'1324': durom.1970.91
'1325': durom.1970.92
'1326': durom.1970.97
'1327': durom.1970.99
'1328': durom.1971.100
'1329': durom.1971.103
'1330': durom.1971.105
'1331': durom.1971.114
'1332': durom.1971.121
'1333': durom.1971.122
'1334': durom.1971.134
'1335': durom.1971.145
'1336': durom.1971.16
'1337': durom.1971.170
'1338': durom.1971.181
'1339': durom.1971.184
'1340': durom.1971.190
'1341': durom.1971.195
'1342': durom.1971.2
'1343': durom.1971.200
'1344': durom.1971.201
'1345': durom.1971.202
'1346': durom.1971.21
'1347': durom.1971.219
'1348': durom.1971.223
'1349': durom.1971.229
'1350': durom.1971.232
'1351': durom.1971.248
'1352': durom.1971.26
'1353': durom.1971.27
'1354': durom.1971.30
'1355': durom.1971.34
'1356': durom.1971.34.1
'1357': durom.1971.34.3
'1358': durom.1971.35
'1359': durom.1971.38
'1360': durom.1971.39
'1361': durom.1971.4
'1362': durom.1971.49
'1363': durom.1971.52
'1364': durom.1971.53
'1365': durom.1971.76
'1366': durom.1971.scott2
'1367': durom.1972.10
'1368': durom.1972.17
'1369': durom.1972.26
'1370': durom.1972.40
'1371': durom.1972.49
'1372': durom.1972.56
'1373': durom.1972.58
'1374': durom.1972.59
'1375': durom.1972.62
'1376': durom.1972.8
'1377': durom.1973.1
'1378': durom.1973.10
'1379': durom.1973.13
'1380': durom.1973.17
'1381': durom.1973.2
'1382': durom.1973.20
'1383': durom.1973.22c
'1384': durom.1973.35
'1385': durom.1973.36
'1386': durom.1973.38
'1387': durom.1973.47
'1388': durom.1973.48
'1389': durom.1973.52
'1390': durom.1974.1
'1391': durom.1974.11
'1392': durom.1974.13
'1393': durom.1974.131
'1394': durom.1974.132
'1395': durom.1974.133
'1396': durom.1974.136
'1397': durom.1974.138
'1398': durom.1974.147
'1399': durom.1974.20
'1400': durom.1974.22
'1401': durom.1974.30.3
'1402': durom.1974.37
'1403': durom.1974.39
'1404': durom.1974.42
'1405': durom.1974.43
'1406': durom.1974.53
'1407': durom.1974.54
'1408': durom.1974.57
'1409': durom.1974.58
'1410': durom.1974.59
'1411': durom.1974.60
'1412': durom.1974.67
'1413': durom.1974.7
'1414': durom.1974.8
'1415': durom.1974.9
'1416': durom.1975.10
'1417': durom.1975.11
'1418': durom.1975.20
'1419': durom.1975.25
'1420': durom.1975.43
'1421': durom.1975.46
'1422': durom.1975.50
'1423': durom.1975.52
'1424': durom.1975.53
'1425': durom.1975.54
'1426': durom.1975.55
'1427': durom.1975.56
'1428': durom.1975.57
'1429': durom.1975.59
'1430': durom.1975.60
'1431': durom.1975.8
'1432': durom.1976.110
'1433': durom.1976.116
'1434': durom.1976.117
'1435': durom.1976.118
'1436': durom.1976.12
'1437': durom.1976.120
'1438': durom.1976.124
'1439': durom.1976.125
'1440': durom.1976.126
'1441': durom.1976.130
'1442': durom.1976.138
'1443': durom.1976.139
'1444': durom.1976.14
'1445': durom.1976.142
'1446': durom.1976.144
'1447': durom.1976.148
'1448': durom.1976.15
'1449': durom.1976.158
'1450': durom.1976.17
'1451': durom.1976.187
'1452': durom.1976.188
'1453': durom.1976.189
'1454': durom.1976.191
'1455': durom.1976.193
'1456': durom.1976.194
'1457': durom.1976.195
'1458': durom.1976.196
'1459': durom.1976.197
'1460': durom.1976.199
'1461': durom.1976.2
'1462': durom.1976.200
'1463': durom.1976.201
'1464': durom.1976.202
'1465': durom.1976.203
'1466': durom.1976.205
'1467': durom.1976.206
'1468': durom.1976.207
'1469': durom.1976.208
'1470': durom.1976.21
'1471': durom.1976.23
'1472': durom.1976.274
'1473': durom.1976.29
'1474': durom.1976.290
'1475': durom.1976.293
'1476': durom.1976.34
'1477': durom.1976.36
'1478': durom.1977.104
'1479': durom.1977.25
'1480': durom.1977.45
'1481': durom.1977.50
'1482': durom.1977.61
'1483': durom.1977.64
'1484': durom.1977.68
'1485': durom.1977.71
'1486': durom.1977.72
'1487': durom.1977.76
'1488': durom.1977.82
'1489': durom.1977.85
'1490': durom.1978.10
'1491': durom.1978.100
'1492': durom.1978.101
'1493': durom.1978.104
'1494': durom.1978.110
'1495': durom.1978.112
'1496': durom.1978.115
'1497': durom.1978.128
'1498': durom.1978.147
'1499': durom.1978.22
'1500': durom.1978.23
'1501': durom.1978.3
'1502': durom.1978.35
'1503': durom.1978.38
'1504': durom.1978.39
'1505': durom.1978.4
'1506': durom.1978.40
'1507': durom.1978.43
'1508': durom.1978.45
'1509': durom.1978.46
'1510': durom.1978.48
'1511': durom.1978.54
'1512': durom.1978.55
'1513': durom.1978.65
'1514': durom.1978.82
'1515': durom.1978.87
'1516': durom.1978.88
'1517': durom.1978.9
'1518': durom.1978.91
'1519': durom.1978.93
'1520': durom.1978.99
'1521': durom.1979.12.12
'1522': durom.1979.12.13
'1523': durom.1979.12.14
'1524': durom.1979.12.15
'1525': durom.1979.12.16
'1526': durom.1979.12.17
'1527': durom.1979.12.18
'1528': durom.1979.12.19
'1529': durom.1979.12.20
'1530': durom.1979.12.5
'1531': durom.1979.12.8
'1532': durom.1979.12.9
'1533': durom.1979.16.19
'1534': durom.1979.24
'1535': durom.1979.27
'1536': durom.1979.32
'1537': durom.1979.34
'1538': durom.1979.35
'1539': durom.1979.37
'1540': durom.1979.42
'1541': durom.1979.43
'1542': durom.1979.47
'1543': durom.1979.5
'1544': durom.1979.52
'1545': durom.1979.53
'1546': durom.1979.62
'1547': durom.1979.63.1
'1548': durom.1979.7
'1549': durom.1979.75
'1550': durom.1979.79
'1551': durom.1979.80
'1552': durom.1979.81
'1553': durom.1979.82
'1554': durom.1980.21
'1555': durom.1980.23
'1556': durom.1980.28
'1557': durom.1980.3
'1558': durom.1980.35
'1559': durom.1980.38
'1560': durom.1980.53
'1561': durom.1980.54
'1562': durom.1980.61
'1563': durom.1980.62
'1564': durom.1980.64
'1565': durom.1980.68
'1566': durom.1980.85
'1567': durom.1980.95
'1568': durom.1981.103
'1569': durom.1981.107
'1570': durom.1981.124
'1571': durom.1983.14
'1572': durom.1983.19
'1573': durom.1983.20
'1574': durom.1983.21
'1575': durom.1983.22
'1576': durom.1983.24
'1577': durom.1983.25
'1578': durom.1983.26
'1579': durom.1983.28
'1580': durom.1983.29
'1581': durom.1983.30
'1582': durom.1983.5
'1583': durom.1983.8
'1584': durom.1983.9
'1585': durom.1984.15
'1586': durom.1984.22
'1587': durom.1985.10
'1588': durom.1985.15
'1589': durom.1985.17
'1590': durom.1985.19
'1591': durom.1985.2
'1592': durom.1985.20
'1593': durom.1985.26
'1594': durom.1985.27
'1595': durom.1985.33
'1596': durom.1985.36
'1597': durom.1985.39
'1598': durom.1985.40
'1599': durom.1985.45
'1600': durom.1985.46
'1601': durom.1985.47
'1602': durom.1985.48
'1603': durom.1985.49
'1604': durom.1985.50
'1605': durom.1985.51
'1606': durom.1985.54
'1607': durom.1985.56
'1608': durom.1985.57
'1609': durom.1985.7
'1610': durom.1986.6
'1611': durom.1986.7
'1612': durom.1986.d100
'1613': durom.1986.d11
'1614': durom.1986.d119
'1615': durom.1986.d12
'1616': durom.1986.d122
'1617': durom.1986.d124
'1618': durom.1986.d126
'1619': durom.1986.d127
'1620': durom.1986.d128
'1621': durom.1986.d131
'1622': durom.1986.d132
'1623': durom.1986.d136
'1624': durom.1986.d137
'1625': durom.1986.d140
'1626': durom.1986.d143
'1627': durom.1986.d15
'1628': durom.1986.d150
'1629': durom.1986.d151
'1630': durom.1986.d154
'1631': durom.1986.d16
'1632': durom.1986.d161
'1633': durom.1986.d17
'1634': durom.1986.d177
'1635': durom.1986.d18
'1636': durom.1986.d183
'1637': durom.1986.d186
'1638': durom.1986.d187
'1639': durom.1986.d195
'1640': durom.1986.d196
'1641': durom.1986.d198
'1642': durom.1986.d20
'1643': durom.1986.d22
'1644': durom.1986.d23
'1645': durom.1986.d25
'1646': durom.1986.d26
'1647': durom.1986.d27
'1648': durom.1986.d29
'1649': durom.1986.d3
'1650': durom.1986.d304
'1651': durom.1986.d31
'1652': durom.1986.d33
'1653': durom.1986.d35
'1654': durom.1986.d36
'1655': durom.1986.d37
'1656': durom.1986.d38
'1657': durom.1986.d4
'1658': durom.1986.d40
'1659': durom.1986.d41
'1660': durom.1986.d5
'1661': durom.1986.d57
'1662': durom.1986.d6
'1663': durom.1986.d60
'1664': durom.1986.d61
'1665': durom.1986.d64
'1666': durom.1986.d66
'1667': durom.1986.d7
'1668': durom.1986.d75
'1669': durom.1986.d77
'1670': durom.1986.d8
'1671': durom.1987.1
'1672': durom.1987.22
'1673': durom.1987.26
'1674': durom.1988.12
'1675': durom.1988.41
'1676': durom.1988.9
'1677': durom.1990.10
'1678': durom.1991.10
'1679': durom.1991.100
'1680': durom.1991.101
'1681': durom.1991.103
'1682': durom.1991.109
'1683': durom.1991.11
'1684': durom.1991.110
'1685': durom.1991.117
'1686': durom.1991.118
'1687': durom.1991.12
'1688': durom.1991.121
'1689': durom.1991.162
'1690': durom.1991.165
'1691': durom.1991.171
'1692': durom.1991.177
'1693': durom.1991.180
'1694': durom.1991.181
'1695': durom.1991.195
'1696': durom.1991.212
'1697': durom.1991.213
'1698': durom.1991.5
'1699': durom.1991.56
'1700': durom.1991.59
'1701': durom.1991.60
'1702': durom.1991.61
'1703': durom.1991.62
'1704': durom.1991.63
'1705': durom.1991.65
'1706': durom.1991.66
'1707': durom.1991.8
'1708': durom.1991.81
'1709': durom.1991.85
'1710': durom.1991.86
'1711': durom.1991.87
'1712': durom.1991.88
'1713': durom.1991.89
'1714': durom.1991.9
'1715': durom.1991.90
'1716': durom.1991.91
'1717': durom.1991.99
'1718': durom.1992.1
'1719': durom.1992.10
'1720': durom.1992.100
'1721': durom.1992.107
'1722': durom.1992.11
'1723': durom.1992.113
'1724': durom.1992.14
'1725': durom.1992.140
'1726': durom.1992.145
'1727': durom.1992.146
'1728': durom.1992.149
'1729': durom.1992.154
'1730': durom.1992.155
'1731': durom.1992.156
'1732': durom.1992.157
'1733': durom.1992.159
'1734': durom.1992.16
'1735': durom.1992.161
'1736': durom.1992.164
'1737': durom.1992.179
'1738': durom.1992.18
'1739': durom.1992.181
'1740': durom.1992.19
'1741': durom.1992.2
'1742': durom.1992.20
'1743': durom.1992.21
'1744': durom.1992.22
'1745': durom.1992.23
'1746': durom.1992.24
'1747': durom.1992.25
'1748': durom.1992.26
'1749': durom.1992.27
'1750': durom.1992.28
'1751': durom.1992.3
'1752': durom.1992.38
'1753': durom.1992.4
'1754': durom.1992.40
'1755': durom.1992.45
'1756': durom.1992.66
'1757': durom.1992.67
'1758': durom.1992.7
'1759': durom.1992.88
'1760': durom.1992.89
'1761': durom.1992.90
'1762': durom.1992.92
'1763': durom.1992.98
'1764': durom.1993.1
'1765': durom.1993.100
'1766': durom.1993.101
'1767': durom.1993.109
'1768': durom.1993.110
'1769': durom.1993.111
'1770': durom.1993.112
'1771': durom.1993.113
'1772': durom.1993.123
'1773': durom.1993.140
'1774': durom.1993.141
'1775': durom.1993.142
'1776': durom.1993.149
'1777': durom.1993.33
'1778': durom.1993.46
'1779': durom.1993.47
'1780': durom.1993.57
'1781': durom.1993.96
'1782': durom.1994.1
'1783': durom.1994.9
'1784': durom.1995.1
'1785': durom.1995.27
'1786': durom.1995.3
'1787': durom.1995.30
'1788': durom.1995.31
'1789': durom.1995.32
'1790': durom.1995.33
'1791': durom.1995.39
'1792': durom.1995.41
'1793': durom.1995.62
'1794': durom.1995.73
'1795': durom.1995.81
'1796': durom.1995.83
'1797': durom.1995.85
'1798': durom.1995.88
'1799': durom.1995.89
'1800': durom.1996.107
'1801': durom.1996.150
'1802': durom.1996.16
'1803': durom.1996.19
'1804': durom.1996.22
'1805': durom.1996.71
'1806': durom.1997.10
'1807': durom.1997.100
'1808': durom.1997.13
'1809': durom.1997.134
'1810': durom.1997.146
'1811': durom.1997.15
'1812': durom.1997.154
'1813': durom.1997.170
'1814': durom.1997.171
'1815': durom.1997.173
'1816': durom.1997.174
'1817': durom.1997.180
'1818': durom.1997.19
'1819': durom.1997.20
'1820': durom.1997.69
'1821': durom.1997.7
'1822': durom.1998.1.1
'1823': durom.1998.12
'1824': durom.1998.15
'1825': durom.1998.17
'1826': durom.1998.19
'1827': durom.1998.20
'1828': durom.1998.21.1
'1829': durom.1998.21.2
'1830': durom.1998.22
'1831': durom.1998.24
'1832': durom.1998.26
'1833': durom.1999.100
'1834': durom.1999.102
'1835': durom.1999.107
'1836': durom.1999.108
'1837': durom.1999.119
'1838': durom.1999.121
'1839': durom.1999.124
'1840': durom.1999.125
'1841': durom.1999.128
'1842': durom.1999.129
'1843': durom.1999.131
'1844': durom.1999.132
'1845': durom.1999.137.1
'1846': durom.1999.138
'1847': durom.1999.139
'1848': durom.1999.140
'1849': durom.1999.141
'1850': durom.1999.142
'1851': durom.1999.19
'1852': durom.1999.30
'1853': durom.1999.49
'1854': durom.1999.51
'1855': durom.1999.52
'1856': durom.1999.53
'1857': durom.1999.54
'1858': durom.1999.80
'1859': durom.1999.88
'1860': durom.1999.89
'1861': durom.1999.92
'1862': durom.1999.93
'1863': durom.1999.94
'1864': durom.1999.95
'1865': durom.1999.99
'1866': durom.2000.1
'1867': durom.2000.11
'1868': durom.2000.12
'1869': durom.2000.13
'1870': durom.2000.14
'1871': durom.2000.15
'1872': durom.2000.19
'1873': durom.2000.20
'1874': durom.2000.26
'1875': durom.2000.27
'1876': durom.2000.7
'1877': durom.2000.8
'1878': durom.2000.9
'1879': durom.2001.100
'1880': durom.2001.11
'1881': durom.2001.12
'1882': durom.2001.121
'1883': durom.2001.122
'1884': durom.2001.126.1
'1885': durom.2001.129
'1886': durom.2001.14
'1887': durom.2001.147
'1888': durom.2001.149
'1889': durom.2001.15
'1890': durom.2001.151
'1891': durom.2001.163
'1892': durom.2001.175
'1893': durom.2001.178
'1894': durom.2001.182
'1895': durom.2001.190
'1896': durom.2001.192
'1897': durom.2001.193
'1898': durom.2001.204
'1899': durom.2001.206
'1900': durom.2001.22
'1901': durom.2001.23
'1902': durom.2001.27
'1903': durom.2001.29
'1904': durom.2001.29.11
'1905': durom.2001.29.16
'1906': durom.2001.29.17
'1907': durom.2001.29.21
'1908': durom.2001.29.22
'1909': durom.2001.29.23
'1910': durom.2001.29.31
'1911': durom.2001.29.7
'1912': durom.2001.29.8
'1913': durom.2001.29.9
'1914': durom.2001.32
'1915': durom.2001.35.1
'1916': durom.2001.35.2
'1917': durom.2001.35.3
'1918': durom.2001.35.4
'1919': durom.2001.35.5
'1920': durom.2001.41
'1921': durom.2001.43
'1922': durom.2001.50
'1923': durom.2001.6
'1924': durom.2001.64.2
'1925': durom.2001.69.2
'1926': durom.2001.9
'1927': durom.2001.91.28
'1928': durom.2001.91.30
'1929': durom.2001.94
'1930': durom.2001.95
'1931': durom.2001.96.1
'1932': durom.2001.96.10
'1933': durom.2001.96.11
'1934': durom.2001.96.12
'1935': durom.2001.96.13
'1936': durom.2001.96.14
'1937': durom.2001.96.15
'1938': durom.2001.96.16
'1939': durom.2001.96.17
'1940': durom.2001.96.18
'1941': durom.2001.96.19
'1942': durom.2001.96.2
'1943': durom.2001.96.20
'1944': durom.2001.96.21
'1945': durom.2001.96.3
'1946': durom.2001.96.4
'1947': durom.2001.96.5
'1948': durom.2001.96.6
'1949': durom.2001.96.7
'1950': durom.2001.96.8
'1951': durom.2001.96.9
'1952': durom.2002.10
'1953': durom.2002.11
'1954': durom.2002.12
'1955': durom.2002.13
'1956': durom.2002.14
'1957': durom.2002.15
'1958': durom.2002.23
'1959': durom.2002.501
'1960': durom.2002.7
'1961': durom.2002.8
'1962': durom.2003.10
'1963': durom.2004.18
'1964': durom.2004.6
'1965': durom.2004.8
'1966': durom.2004.9
'1967': durom.2005.2
'1968': durom.2006.20
'1969': durom.2006.21
'1970': durom.2006.22
'1971': durom.2006.24.2
'1972': durom.2006.26
'1973': durom.2006.27
'1974': durom.2006.28
'1975': durom.2006.30
'1976': durom.2006.31
'1977': durom.2006.33
'1978': durom.2006.34
'1979': durom.2006.35
'1980': durom.2006.36
'1981': durom.2006.37
'1982': durom.2006.38
'1983': durom.2006.39
'1984': durom.2006.40
'1985': durom.2006.44
'1986': durom.2006.47
'1987': durom.2006.48
'1988': durom.2006.49
'1989': durom.2006.50
'1990': durom.2006.51
'1991': durom.2006.52
'1992': durom.2006.53
'1993': durom.2006.53.129
'1994': durom.2006.53.167
'1995': durom.2006.53.168
'1996': durom.2006.53.169
'1997': durom.2006.53.170
'1998': durom.2006.53.173
'1999': durom.2006.53.174
'2000': durom.2006.53.178
'2001': durom.2006.53.184
'2002': durom.2006.53.191
'2003': durom.2006.53.21
'2004': durom.2006.53.22
'2005': durom.2006.53.23
'2006': durom.2006.53.26
'2007': durom.2006.53.27
'2008': durom.2006.53.31
'2009': durom.2006.53.32.1
'2010': durom.2006.53.34.1
'2011': durom.2006.53.36.1
'2012': durom.2006.53.37.1
'2013': durom.2006.53.37.3
'2014': durom.2006.53.38
'2015': durom.2006.53.39.1
'2016': durom.2006.53.40.1
'2017': durom.2006.53.40.6
'2018': durom.2006.53.40.8
'2019': durom.2006.53.41.1
'2020': durom.2006.53.44
'2021': durom.2006.53.46
'2022': durom.2006.53.82.1
'2023': durom.2006.53.91
'2024': durom.2006.62
'2025': durom.2006.63
'2026': durom.2006.65
'2027': durom.2006.68
'2028': durom.2008.2
'2029': durom.2008.4
'2030': durom.2009.1
'2031': durom.2009.2
'2032': durom.2009.3
'2033': durom.2009.74
'2034': durom.2009.75
'2035': durom.2009.8
'2036': durom.2009.9
'2037': durom.2010.14
'2038': durom.2010.22
'2039': durom.2010.25
'2040': durom.2010.43
'2041': durom.2010.48
'2042': durom.2010.49
'2043': durom.2010.71
'2044': durom.2011.12
'2045': durom.2011.4
'2046': durom.2011.5
'2047': durom.2011.6
'2048': durom.2011.61
'2049': durom.2011.63
'2050': durom.2011.64
'2051': durom.2011.7
'2052': durom.2011.8
'2053': durom.2012.10
'2054': durom.2012.11
'2055': durom.2012.12
'2056': durom.2012.129
'2057': durom.2012.130
'2058': durom.2012.131
'2059': durom.2012.132
'2060': durom.2012.133
'2061': durom.2012.134
'2062': durom.2012.135
'2063': durom.2012.136
'2064': durom.2012.137
'2065': durom.2012.138
'2066': durom.2012.139
'2067': durom.2012.140
'2068': durom.2012.141
'2069': durom.2012.36
'2070': durom.2012.37
'2071': durom.2012.38
'2072': durom.2012.39
'2073': durom.2012.40
'2074': durom.2012.44
'2075': durom.2012.45
'2076': durom.2012.46
'2077': durom.2012.47
'2078': durom.2012.48
'2079': durom.2012.49
'2080': durom.2012.50
'2081': durom.2012.51
'2082': durom.2012.8
'2083': durom.2012.9
'2084': durom.2013.1
'2085': durom.2013.10
'2086': durom.2013.105
'2087': durom.2013.106
'2088': durom.2013.109
'2089': durom.2013.11
'2090': durom.2013.110
'2091': durom.2013.111
'2092': durom.2013.112
'2093': durom.2013.113
'2094': durom.2013.114
'2095': durom.2013.115
'2096': durom.2013.119
'2097': durom.2013.12
'2098': durom.2013.120
'2099': durom.2013.121
'2100': durom.2013.122
'2101': durom.2013.123
'2102': durom.2013.125
'2103': durom.2013.126
'2104': durom.2013.129
'2105': durom.2013.13
'2106': durom.2013.132
'2107': durom.2013.133
'2108': durom.2013.134
'2109': durom.2013.14
'2110': durom.2013.15
'2111': durom.2013.157
'2112': durom.2013.16
'2113': durom.2013.17
'2114': durom.2013.173
'2115': durom.2013.173.12
'2116': durom.2013.174
'2117': durom.2013.175
'2118': durom.2013.176
'2119': durom.2013.177
'2120': durom.2013.178
'2121': durom.2013.179
'2122': durom.2013.180
'2123': durom.2013.181
'2124': durom.2013.187
'2125': durom.2013.188
'2126': durom.2013.190
'2127': durom.2013.2
'2128': durom.2013.208
'2129': durom.2013.224
'2130': durom.2013.225
'2131': durom.2013.246
'2132': durom.2013.247
'2133': durom.2013.252
'2134': durom.2013.258
'2135': durom.2013.298.1
'2136': durom.2013.3
'2137': durom.2013.302
'2138': durom.2013.304
'2139': durom.2013.305
'2140': durom.2013.307
'2141': durom.2013.329
'2142': durom.2013.33.1
'2143': durom.2013.33.2
'2144': durom.2013.330
'2145': durom.2013.338
'2146': durom.2013.340.1
'2147': durom.2013.340.2
'2148': durom.2013.340.3
'2149': durom.2013.340.4
'2150': durom.2013.340.5
'2151': durom.2013.341.2
'2152': durom.2013.342.2
'2153': durom.2013.343
'2154': durom.2013.35
'2155': durom.2013.350
'2156': durom.2013.350.1
'2157': durom.2013.350.2
'2158': durom.2013.350.3
'2159': durom.2013.350.4
'2160': durom.2013.351
'2161': durom.2013.4
'2162': durom.2013.41
'2163': durom.2013.42
'2164': durom.2013.43
'2165': durom.2013.5
'2166': durom.2013.52
'2167': durom.2013.53
'2168': durom.2013.54
'2169': durom.2013.55
'2170': durom.2013.56
'2171': durom.2013.57
'2172': durom.2013.58
'2173': durom.2013.59
'2174': durom.2013.6
'2175': durom.2013.60
'2176': durom.2013.61
'2177': durom.2013.62
'2178': durom.2013.63
'2179': durom.2013.64
'2180': durom.2013.65
'2181': durom.2013.66
'2182': durom.2013.67
'2183': durom.2013.68
'2184': durom.2013.69
'2185': durom.2013.7
'2186': durom.2013.70
'2187': durom.2013.78
'2188': durom.2013.79
'2189': durom.2013.8
'2190': durom.2013.9
'2191': durom.2013.90
'2192': durom.2013.93
'2193': durom.2013.95
'2194': durom.2013.96.1
'2195': durom.2013.96.11
'2196': durom.2013.96.5
'2197': durom.2013.99
'2198': durom.2014.1
'2199': durom.2014.1.1
'2200': durom.2014.1.125
'2201': durom.2014.1.2
'2202': durom.2014.1.3
'2203': durom.2014.1.4
'2204': durom.2014.1.71
'2205': durom.2014.1.77
'2206': durom.2014.1.78
'2207': durom.2014.1.79
'2208': durom.2014.105
'2209': durom.2014.106
'2210': durom.2014.107
'2211': durom.2014.108
'2212': durom.2014.109
'2213': durom.2014.110
'2214': durom.2014.111
'2215': durom.2014.112
'2216': durom.2014.113
'2217': durom.2014.114
'2218': durom.2014.115
'2219': durom.2014.116
'2220': durom.2014.117
'2221': durom.2014.118
'2222': durom.2014.119
'2223': durom.2014.120
'2224': durom.2014.121
'2225': durom.2014.122
'2226': durom.2014.123
'2227': durom.2014.124
'2228': durom.2014.125
'2229': durom.2014.127
'2230': durom.2014.128
'2231': durom.2014.131
'2232': durom.2014.132
'2233': durom.2014.133
'2234': durom.2014.134
'2235': durom.2014.135
'2236': durom.2014.136
'2237': durom.2014.137
'2238': durom.2014.138
'2239': durom.2014.14
'2240': durom.2014.141
'2241': durom.2014.142
'2242': durom.2014.143
'2243': durom.2014.145
'2244': durom.2014.20
'2245': durom.2014.21
'2246': durom.2014.22
'2247': durom.2014.228
'2248': durom.2014.230
'2249': durom.2014.232
'2250': durom.2014.24
'2251': durom.2014.243
'2252': durom.2014.249
'2253': durom.2014.25
'2254': durom.2014.250
'2255': durom.2014.254
'2256': durom.2014.256
'2257': durom.2014.26
'2258': durom.2014.273
'2259': durom.2014.285
'2260': durom.2014.290
'2261': durom.2014.291
'2262': durom.2014.292
'2263': durom.2014.293
'2264': durom.2014.294
'2265': durom.2014.295
'2266': durom.2014.297
'2267': durom.2014.3
'2268': durom.2014.305
'2269': durom.2014.311
'2270': durom.2014.317
'2271': durom.2014.318
'2272': durom.2014.321
'2273': durom.2014.33
'2274': durom.2014.342.101
'2275': durom.2014.344
'2276': durom.2014.346.1
'2277': durom.2014.346.2
'2278': durom.2014.347
'2279': durom.2014.348
'2280': durom.2014.360
'2281': durom.2014.361
'2282': durom.2014.362
'2283': durom.2014.395
'2284': durom.2014.4
'2285': durom.2014.400
'2286': durom.2014.419
'2287': durom.2014.435
'2288': durom.2014.436
'2289': durom.2014.439
'2290': durom.2014.44
'2291': durom.2014.450
'2292': durom.2014.457
'2293': durom.2014.458
'2294': durom.2014.462
'2295': durom.2014.463
'2296': durom.2014.465
'2297': durom.2014.466
'2298': durom.2014.468
'2299': durom.2014.469
'2300': durom.2014.47
'2301': durom.2014.470
'2302': durom.2014.471
'2303': durom.2014.474
'2304': durom.2014.477
'2305': durom.2014.478
'2306': durom.2014.48
'2307': durom.2014.484
'2308': durom.2014.486
'2309': durom.2014.487
'2310': durom.2014.49
'2311': durom.2014.501
'2312': durom.2014.51
'2313': durom.2014.510
'2314': durom.2014.513
'2315': durom.2014.514
'2316': durom.2014.52
'2317': durom.2014.53
'2318': durom.2014.536
'2319': durom.2014.537
'2320': durom.2014.538
'2321': durom.2014.539
'2322': durom.2014.54
'2323': durom.2014.540
'2324': durom.2014.56
'2325': durom.2014.6
'2326': durom.2014.67
'2327': durom.2014.68
'2328': durom.2014.72
'2329': durom.2014.73
'2330': durom.2014.81
'2331': durom.2014.82.1
'2332': durom.2014.82.2
'2333': durom.2014.92
'2334': durom.2014.95
'2335': durom.2014.97
'2336': durom.2014.98
'2337': durom.2015.123
'2338': durom.2015.124
'2339': durom.2015.147
'2340': durom.2015.159
'2341': durom.2015.178
'2342': durom.2015.179
'2343': durom.2015.18
'2344': durom.2015.19
'2345': durom.2015.20
'2346': durom.2015.21
'2347': durom.2015.212
'2348': durom.2015.213
'2349': durom.2015.214
'2350': durom.2015.215
'2351': durom.2015.22
'2352': durom.2015.226
'2353': durom.2015.227
'2354': durom.2015.230
'2355': durom.2015.235
'2356': durom.2015.25
'2357': durom.2015.257
'2358': durom.2015.267
'2359': durom.2015.274
'2360': durom.2015.28
'2361': durom.2015.29
'2362': durom.2015.295
'2363': durom.2015.296
'2364': durom.2015.297
'2365': durom.2015.298
'2366': durom.2015.31
'2367': durom.2015.319
'2368': durom.2015.32
'2369': durom.2015.33
'2370': durom.2015.34
'2371': durom.2015.340.1
'2372': durom.2015.340.17
'2373': durom.2015.340.18
'2374': durom.2015.340.19
'2375': durom.2015.340.2
'2376': durom.2015.340.20
'2377': durom.2015.340.21
'2378': durom.2015.340.22
'2379': durom.2015.340.23
'2380': durom.2015.340.24
'2381': durom.2015.340.25
'2382': durom.2015.340.26
'2383': durom.2015.340.27
'2384': durom.2015.340.28
'2385': durom.2015.340.29
'2386': durom.2015.340.30
'2387': durom.2015.340.31
'2388': durom.2015.340.32
'2389': durom.2015.340.33
'2390': durom.2015.340.34
'2391': durom.2015.340.35
'2392': durom.2015.340.36
'2393': durom.2015.340.37
'2394': durom.2015.340.38
'2395': durom.2015.340.39
'2396': durom.2015.340.4
'2397': durom.2015.340.40
'2398': durom.2015.340.41
'2399': durom.2015.340.42
'2400': durom.2015.340.43
'2401': durom.2015.340.44
'2402': durom.2015.340.45
'2403': durom.2015.340.46
'2404': durom.2015.340.47
'2405': durom.2015.340.48
'2406': durom.2015.340.49
'2407': durom.2015.340.50
'2408': durom.2015.340.51
'2409': durom.2015.340.52
'2410': durom.2015.340.53
'2411': durom.2015.340.54
'2412': durom.2015.340.55
'2413': durom.2015.340.56
'2414': durom.2015.340.57
'2415': durom.2015.340.58
'2416': durom.2015.340.59
'2417': durom.2015.340.64
'2418': durom.2015.340.65
'2419': durom.2015.341
'2420': durom.2015.35
'2421': durom.2015.352
'2422': durom.2015.354
'2423': durom.2015.357
'2424': durom.2015.36
'2425': durom.2015.361
'2426': durom.2015.362
'2427': durom.2015.363
'2428': durom.2015.364
'2429': durom.2015.365
'2430': durom.2015.366
'2431': durom.2015.367
'2432': durom.2015.37
'2433': durom.2015.38
'2434': durom.2015.385.10
'2435': durom.2015.385.11
'2436': durom.2015.385.2
'2437': durom.2015.39
'2438': durom.2015.391
'2439': durom.2015.401
'2440': durom.2015.41
'2441': durom.2015.42
'2442': durom.2015.422
'2443': durom.2015.423.1
'2444': durom.2015.43
'2445': durom.2015.44
'2446': durom.2015.45
'2447': durom.2015.46
'2448': durom.2015.498
'2449': durom.2015.50
'2450': durom.2015.501
'2451': durom.2015.507
'2452': durom.2015.55
'2453': durom.2015.60
'2454': durom.2015.61
'2455': durom.2015.62
'2456': durom.2015.63
'2457': durom.2015.71
'2458': durom.2015.93
'2459': durom.2015.95
'2460': durom.2015.98
'2461': durom.2016.1
'2462': durom.2016.10
'2463': durom.2016.101
'2464': durom.2016.102
'2465': durom.2016.103.1
'2466': durom.2016.103.2
'2467': durom.2016.104.1
'2468': durom.2016.104.2
'2469': durom.2016.105
'2470': durom.2016.106
'2471': durom.2016.107
'2472': durom.2016.108
'2473': durom.2016.109
'2474': durom.2016.21
'2475': durom.2016.26
'2476': durom.2016.27
'2477': durom.2016.28
'2478': durom.2016.31
'2479': durom.2016.40
'2480': durom.2016.47
'2481': durom.2016.52
'2482': durom.2016.58
'2483': durom.2016.59
'2484': durom.2016.60
'2485': durom.2016.73
'2486': durom.2016.78
'2487': durom.2017.18
'2488': durom.2017.20
'2489': durom.2017.21
'2490': durom.2017.22
'2491': durom.2017.33
'2492': durom.2017.34
'2493': durom.2017.35
'2494': durom.2017.37
'2495': durom.2017.38
'2496': durom.2017.48
'2497': durom.2017.49
'2498': durom.2017.50
'2499': durom.2017.62
'2500': durom.2017.73
'2501': durom.2017.8
'2502': durom.2017.87
'2503': durom.2017.88
'2504': durom.2017.89
'2505': durom.2017.90
'2506': durom.2017.92
'2507': durom.2017.93
'2508': durom.2017.94
'2509': durom.2017.95
'2510': durom.2017.96
'2511': durom.2018.1
'2512': durom.2018.2
'2513': durom.272
'2514': durom.316
'2515': durom.370
'2516': durom.50
'2517': durom.60
'2518': durom.64
'2519': durom.69
'2520': durom.75
'2521': durom.81
'2522': duruc.1924.105
'2523': duruc.1924.15
'2524': duruc.1924.3.13
'2525': duruc.1924.3.5
'2526': duruc.1924.40
'2527': duruc.1924.42
'2528': duruc.1924.44
'2529': duruc.1924.45
'2530': duruc.1924.6.2
'2531': duruc.1924.76.1
'2532': duruc.1924.77.1
'2533': duruc.1924.77.2
'2534': duruc.1924.78
'2535': duruc.1924.79
'2536': duruc.2014.1.10
'2537': duruc.2014.1.2
'2538': duruc.2016.41
'2539': duruc.2016.44
'2540': duruc.2016.47.2
'2541': duruc.2016.50.1
'2542': duruc.2016.50.2
'2543': duruc.2018.13.1
'2544': duruc.2018.13.2
'2545': duruc.2018.14
'2546': duruc.2018.15
'2547': duruc.2018.16
'2548': duruc.2018.17
'2549': duruc.2018.18
'2550': duruc.2018.19
'2551': duruc.2018.20
'2552': duruc.2018.21
'2553': duruc.2018.22
'2554': duruc.2018.23
'2555': duruc.2018.24
'2556': duruc.2018.25
'2557': duruc.2018.26
'2558': duruc.2018.27
'2559': duruc.2018.3
'2560': duruc.2018.6
'2561': duruc.2018.7
'2562': duruc.2020.1
'2563': duruc.2020.2
'2564': duruc.2020.25.1
'2565': eg1002
'2566': eg1004
'2567': eg1012
'2568': eg1013
'2569': eg1030
'2570': eg1033
'2571': eg104
'2572': eg1041
'2573': eg1050
'2574': eg1054
'2575': eg1059
'2576': eg107
'2577': eg112
'2578': eg1141
'2579': eg117
'2580': eg1188
'2581': eg1190
'2582': eg1191
'2583': eg1193
'2584': eg1194
'2585': eg1195
'2586': eg1196
'2587': eg1197
'2588': eg1200
'2589': eg1216
'2590': eg1217
'2591': eg1218
'2592': eg1219
'2593': eg1236
'2594': eg1237
'2595': eg1239
'2596': eg1246
'2597': eg1247
'2598': eg1248
'2599': eg1255
'2600': eg1257
'2601': eg1259
'2602': eg1260
'2603': eg1273
'2604': eg1276
'2605': eg1280
'2606': eg1282
'2607': eg1283
'2608': eg1284
'2609': eg1285
'2610': eg1286
'2611': eg1287
'2612': eg1288
'2613': eg1289
'2614': eg1290
'2615': eg1293
'2616': eg1294
'2617': eg1302
'2618': eg1329
'2619': eg133
'2620': eg1335
'2621': eg135
'2622': eg1374
'2623': eg138
'2624': eg139
'2625': eg1414
'2626': eg1415
'2627': eg1416
'2628': eg1423
'2629': eg1424
'2630': eg1427
'2631': eg143
'2632': eg1438
'2633': eg1439
'2634': eg1440
'2635': eg1449
'2636': eg147
'2637': eg1491
'2638': eg1493
'2639': eg1500
'2640': eg1506
'2641': eg1508
'2642': eg151
'2643': eg1522
'2644': eg1523
'2645': eg1524
'2646': eg1528
'2647': eg1529
'2648': eg1550
'2649': eg1552
'2650': eg156
'2651': eg1561
'2652': eg1568
'2653': eg1603
'2654': eg1605
'2655': eg1615
'2656': eg1616
'2657': eg1618
'2658': eg1620
'2659': eg165
'2660': eg1656
'2661': eg1659
'2662': eg167
'2663': eg1689
'2664': eg1690
'2665': eg1695
'2666': eg1727
'2667': eg1729
'2668': eg1730
'2669': eg1733
'2670': eg1742
'2671': eg1745
'2672': eg1747
'2673': eg1748
'2674': eg1749
'2675': eg175
'2676': eg1752
'2677': eg1786
'2678': eg1789
'2679': eg1790
'2680': eg1791
'2681': eg1793
'2682': eg1796
'2683': eg1801
'2684': eg1803
'2685': eg1809
'2686': eg1813
'2687': eg1814
'2688': eg1824
'2689': eg1826
'2690': eg1833
'2691': eg1838
'2692': eg1840
'2693': eg1842
'2694': eg1845
'2695': eg1850
'2696': eg1851
'2697': eg1856
'2698': eg1857
'2699': eg1859
'2700': eg186
'2701': eg1861
'2702': eg1862
'2703': eg1865
'2704': eg1874
'2705': eg1933
'2706': eg1934
'2707': eg1936
'2708': eg1948
'2709': eg196
'2710': eg200
'2711': eg204
'2712': eg2070
'2713': eg2076
'2714': eg2084
'2715': eg2091
'2716': eg211
'2717': eg212
'2718': eg2133
'2719': eg2134
'2720': eg215
'2721': eg2150
'2722': eg2155
'2723': eg2168
'2724': eg2173
'2725': eg2175
'2726': eg2176
'2727': eg2182
'2728': eg2185
'2729': eg2186
'2730': eg2188
'2731': eg2195
'2732': eg2200
'2733': eg2204
'2734': eg221
'2735': eg2222
'2736': eg2223
'2737': eg2239
'2738': eg2251
'2739': eg2261
'2740': eg2280
'2741': eg2281
'2742': eg2283
'2743': eg2284
'2744': eg2291
'2745': eg2300
'2746': eg2301
'2747': eg2305
'2748': eg2309
'2749': eg2325
'2750': eg2353
'2751': eg2356
'2752': eg2361
'2753': eg2364
'2754': eg2366
'2755': eg2400
'2756': eg2402
'2757': eg2472
'2758': eg2481
'2759': eg2493
'2760': eg2495.1
'2761': eg2495.2
'2762': eg2498
'2763': eg2526
'2764': eg2540
'2765': eg2547
'2766': eg2548
'2767': eg264
'2768': eg2647
'2769': eg2670
'2770': eg2733
'2771': eg2735
'2772': eg2738
'2773': eg2749
'2774': eg2750
'2775': eg2762
'2776': eg2763
'2777': eg279
'2778': eg2797
'2779': eg2804.1
'2780': eg2810
'2781': eg2863
'2782': eg2918
'2783': eg2924
'2784': eg293
'2785': eg2930
'2786': eg2933
'2787': eg2948
'2788': eg2962
'2789': eg2973
'2790': eg2991
'2791': eg306
'2792': eg3062
'2793': eg307
'2794': eg3070
'2795': eg308
'2796': eg3083
'2797': eg309
'2798': eg310
'2799': eg311
'2800': eg3114
'2801': eg312
'2802': eg313
'2803': eg3138
'2804': eg3143
'2805': eg3151
'2806': eg317
'2807': eg3173
'2808': eg3180
'2809': eg3182
'2810': eg3195
'2811': eg3217
'2812': eg322
'2813': eg3224
'2814': eg3336
'2815': eg3337
'2816': eg3338
'2817': eg3349
'2818': eg3358
'2819': eg3372
'2820': eg3375
'2821': eg3382
'2822': eg3385
'2823': eg3388
'2824': eg3394
'2825': eg3424
'2826': eg3427
'2827': eg343
'2828': eg344
'2829': eg3475
'2830': eg3476
'2831': eg3482
'2832': eg3487
'2833': eg3488
'2834': eg3493
'2835': eg3502
'2836': eg3503
'2837': eg3504
'2838': eg3506
'2839': eg351
'2840': eg3510
'2841': eg3516
'2842': eg354
'2843': eg355
'2844': eg3557
'2845': eg3561
'2846': eg3565
'2847': eg3568
'2848': eg3569
'2849': eg3571
'2850': eg3572
'2851': eg3575
'2852': eg3576
'2853': eg3580
'2854': eg3621
'2855': eg366
'2856': eg3707
'2857': eg3785
'2858': eg3789
'2859': eg381
'2860': eg3815
'2861': eg3868
'2862': eg3873
'2863': eg3904
'2864': eg3906
'2865': eg391
'2866': eg392
'2867': eg393
'2868': eg3969
'2869': eg3971
'2870': eg3972
'2871': eg3974
'2872': eg3976
'2873': eg3979
'2874': eg3985
'2875': eg3989
'2876': eg3990
'2877': eg3991
'2878': eg3995
'2879': eg3996
'2880': eg3997
'2881': eg3998
'2882': eg3999
'2883': eg4000
'2884': eg4003
'2885': eg4004
'2886': eg4005
'2887': eg4006
'2888': eg4007
'2889': eg4008
'2890': eg4009
'2891': eg4010
'2892': eg4012
'2893': eg411
'2894': eg413
'2895': eg418
'2896': eg423
'2897': eg433
'2898': eg4347
'2899': eg4348
'2900': eg4362
'2901': eg4363
'2902': eg4364
'2903': eg4365
'2904': eg4366
'2905': eg4368
'2906': eg4369
'2907': eg4384
'2908': eg4387
'2909': eg4388
'2910': eg4389
'2911': eg4393
'2912': eg4394
'2913': eg4395
'2914': eg4396
'2915': eg4398
'2916': eg4402
'2917': eg4404
'2918': eg4406
'2919': eg4408
'2920': eg4409
'2921': eg4411
'2922': eg4413
'2923': eg4414
'2924': eg4415
'2925': eg4416
'2926': eg4417
'2927': eg4418
'2928': eg4444
'2929': eg4458
'2930': eg4462
'2931': eg4470
'2932': eg4495
'2933': eg4501
'2934': eg4510
'2935': eg4527
'2936': eg4568
'2937': eg457
'2938': eg4570
'2939': eg4572
'2940': eg4573
'2941': eg4575
'2942': eg4579
'2943': eg4581
'2944': eg4583
'2945': eg4589
'2946': eg460
'2947': eg4602
'2948': eg4626
'2949': eg465
'2950': eg466
'2951': eg467
'2952': eg468
'2953': eg469
'2954': eg471
'2955': eg474
'2956': eg475
'2957': eg483
'2958': eg4845
'2959': eg4847
'2960': eg493
'2961': eg4953
'2962': eg4954
'2963': eg4957
'2964': eg4959
'2965': eg4960
'2966': eg4962
'2967': eg4964
'2968': eg4966
'2969': eg4968
'2970': eg4970
'2971': eg498
'2972': eg500
'2973': eg5000
'2974': eg5003
'2975': eg5011
'2976': eg5017
'2977': eg503
'2978': eg5037
'2979': eg505
'2980': eg5059
'2981': eg5064
'2982': eg507
'2983': eg5072
'2984': eg5073
'2985': eg5075
'2986': eg5082
'2987': eg5083
'2988': eg5084
'2989': eg5085
'2990': eg509
'2991': eg5102
'2992': eg5103
'2993': eg5104
'2994': eg5106
'2995': eg5137
'2996': eg5139
'2997': eg5145
'2998': eg5149
'2999': eg5153
'3000': eg5169
'3001': eg517
'3002': eg518
'3003': eg5180
'3004': eg5181
'3005': eg519
'3006': eg520
'3007': eg521
'3008': eg5211
'3009': eg5213
'3010': eg5214
'3011': eg5215
'3012': eg5216
'3013': eg5222
'3014': eg5225
'3015': eg5227
'3016': eg5229
'3017': eg523
'3018': eg5231
'3019': eg5232
'3020': eg5235
'3021': eg524
'3022': eg5242
'3023': eg5245
'3024': eg5253
'3025': eg5255
'3026': eg5256
'3027': eg5257
'3028': eg5258
'3029': eg5259
'3030': eg526
'3031': eg5260
'3032': eg5267
'3033': eg5269
'3034': eg5270
'3035': eg5273
'3036': eg5274
'3037': eg5278
'3038': eg5279
'3039': eg528
'3040': eg5281
'3041': eg5288
'3042': eg5321
'3043': eg5326
'3044': eg5327
'3045': eg5328
'3046': eg5329
'3047': eg5331
'3048': eg5333
'3049': eg5344
'3050': eg5347
'3051': eg535
'3052': eg5355
'3053': eg5376
'3054': eg5380
'3055': eg5382
'3056': eg5383
'3057': eg540
'3058': eg541
'3059': eg542
'3060': eg5428
'3061': eg5432
'3062': eg5434
'3063': eg5435
'3064': eg5573
'3065': eg5588
'3066': eg5593
'3067': eg5597
'3068': eg5707
'3069': eg571
'3070': eg5712
'3071': eg5715
'3072': eg5717
'3073': eg572
'3074': eg573
'3075': eg575
'3076': eg577
'3077': eg578
'3078': eg5809
'3079': eg582
'3080': eg5850
'3081': eg5851
'3082': eg605
'3083': eg6054
'3084': eg6057
'3085': eg608
'3086': eg609
'3087': eg613
'3088': eg6135
'3089': eg6140
'3090': eg6146
'3091': eg6148
'3092': eg615
'3093': eg6150
'3094': eg6164
'3095': eg6184
'3096': eg6190
'3097': eg6203
'3098': eg6226
'3099': eg6251
'3100': eg6306
'3101': eg6352
'3102': eg6360
'3103': eg64
'3104': eg6417
'3105': eg6419
'3106': eg6433
'3107': eg6435
'3108': eg6446
'3109': eg6449
'3110': eg6724
'3111': eg6739
'3112': eg6767
'3113': eg6785
'3114': eg6786
'3115': eg6797
'3116': eg6798
'3117': eg6802
'3118': eg6803
'3119': eg6807
'3120': eg6808
'3121': eg6809
'3122': eg6814
'3123': eg6818
'3124': eg6822
'3125': eg6854
'3126': eg6857
'3127': eg6883
'3128': eg6897
'3129': eg6917
'3130': eg6951
'3131': eg6955
'3132': eg6957
'3133': eg6972
'3134': eg6974
'3135': eg6977
'3136': eg6978
'3137': eg713
'3138': eg715
'3139': eg716
'3140': eg718
'3141': eg720
'3142': eg721
'3143': eg722
'3144': eg724
'3145': eg726
'3146': eg727
'3147': eg732
'3148': eg733
'3149': eg741
'3150': eg749
'3151': eg75
'3152': eg768
'3153': eg774
'3154': eg775
'3155': eg776
'3156': eg779
'3157': eg881
'3158': eg882
'3159': eg883
'3160': eg885
'3161': eg898
'3162': eg901
'3163': eg907
'3164': eg909
'3165': eg913
'3166': eg914
'3167': eg916
'3168': eg917
'3169': eg930
'3170': eg934
'3171': eg936
'3172': eg938
'3173': eg950
'3174': eg964
'3175': eg967
- name: file
dtype: string
- name: image
dtype: image
- name: root
dtype: string
- name: description
dtype: string
- name: object_name
dtype: string
- name: other_name
dtype: string
- name: material
dtype: string
- name: production.period
dtype: string
- name: production.place
dtype: string
splits:
- name: validation
num_bytes: 407362176.052
num_examples: 3782
- name: test
num_bytes: 437426561.852
num_examples: 3782
- name: train
num_bytes: 2919909278.325
num_examples: 80365
download_size: 4210194967
dataset_size: 3764698016.229
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
houck2040/satire | ---
license: mit
---
|
blindsubmissions/GH_text2code | ---
dataset_info:
features:
- name: identifier
dtype: string
- name: parameters
dtype: string
- name: docstring
dtype: string
- name: docstring_summary
dtype: string
- name: function
dtype: string
- name: function_tokens
sequence: string
- name: start_point
sequence: int64
- name: end_point
sequence: int64
- name: language
dtype: string
- name: docstring_language
dtype: string
- name: docstring_language_predictions
dtype: string
- name: is_langid_reliable
dtype: string
splits:
- name: python_gh
num_bytes: 36300760423
num_examples: 15000002
- name: java_gh
num_bytes: 21613057110
num_examples: 15000014
- name: go_gh
num_bytes: 22559741937
num_examples: 15000078
- name: javascript_gh
num_bytes: 3895688311
num_examples: 2000040
download_size: 166324499
dataset_size: 84369247781
task_categories:
- translation
- summarization
- text2text-generation
language:
- en
tags:
- code
size_categories:
- 10M<n<100M
---
# Docstring to code data
## Dataset Summary
This dataset contains pairs of English text and code from multiple programming language pairs. Namely, text is paired with code snippets for: Python, Java, JavaScript, and Go. The data is curated via an automated filtering pipeline from source files within [The Stack](https://huggingface.co/datasets/bigcode/the-stack).
## Supported Tasks
This dataset can be used to finetune models for code-to-text and/or text-to-code tasks, in both information retrieval and conditional generation settings.
## Splits
```python
DATA_SPLITS = {"python_gh", "java_gh", "javascript_gh", "go_gh"}
```
## How to get the data with a given programming language
```python
from datasets import load_dataset

def get_dataset(prog_lang):
    test_data = load_dataset("blindsubmissions/GH_text2code", split=prog_lang)
    return test_data
```
## Dataset Structure
### Data Instances
Each data instance corresponds to a function or method occurring in licensed files that compose The Stack, that is, files with permissive licences collected from GitHub.
### Relevant Data Fields
- identifier (string): Function/method name.
- parameters (string): Function parameters.
- return_statement (string): Return statement if found during parsing.
- docstring (string): Complete docstring content.
- docstring_summary (string): Summary/processed docstring dropping args and return statements.
- function (string): Actual function/method content.
- argument_list (null): List of arguments.
- language (string): Programming language of the function.
- type (string): Return type if found during parsing.
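To make the schema concrete, a single instance can be pictured as a plain Python dict. The values below are illustrative placeholders made up for demonstration, not taken from the dataset itself:

```python
# Illustrative shape of one instance; values are invented for demonstration.
example = {
    "identifier": "add",
    "parameters": "(a, b)",
    "docstring": "Add two numbers.\n\nReturns their sum.",
    "docstring_summary": "Add two numbers.",
    "function": 'def add(a, b):\n    """Add two numbers."""\n    return a + b',
    "function_tokens": ["def", "add", "(", "a", ",", "b", ")", ":",
                        "return", "a", "+", "b"],
    "language": "python",
    "docstring_language": "en",
}

# A text-to-code retrieval pair is simply (docstring_summary, function).
query, candidate = example["docstring_summary"], example["function"]
print(query)  # Add two numbers.
```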
## Summary of data curation pipeline
- Filtering out repositories that appear in [CodeSearchNet](https://huggingface.co/datasets/code_search_net).
- Filtering the files that belong to the programming languages of interest.
- Pre-filtering the files that likely contain text in the natural languages of interest.
- AST parsing with [Tree-sitter](https://tree-sitter.github.io/tree-sitter/).
- Perform language identification of docstrings in the resulting set of functions/methods and select the ones classified as English via majority voting.
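The last step above — keeping only functions whose docstrings are classified as English by majority voting — can be sketched as follows. The classifier names in the comment are placeholders; the card does not specify which language-ID models the pipeline uses:

```python
from collections import Counter

def majority_vote_is_english(predictions):
    """Return True if a strict majority of language-ID predictions is English.

    `predictions` is a list of ISO language codes, one per classifier
    (e.g. from langid, fastText, cld3 -- placeholders here).
    """
    most_common_lang, count = Counter(predictions).most_common(1)[0]
    return most_common_lang == "en" and count > len(predictions) / 2

# Keep only functions whose docstring is classified as English.
docstring_predictions = {
    "def f": ["en", "en", "de"],   # majority English -> kept
    "def g": ["fr", "fr", "en"],   # majority French  -> dropped
}
kept = [fn for fn, preds in docstring_predictions.items()
        if majority_vote_is_english(preds)]
print(kept)  # ['def f']
```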
## Social Impact of the dataset
This dataset is released with the aim of increasing the availability of training data for the NLP-for-code research community by providing text/code paired data. We expect this data to help enable more accurate information retrieval systems and text-to-code or code-to-text summarization systems.
As a subset of The Stack, this dataset inherits the de-risking efforts carried out when that dataset was built. We highlight, however, that risks still exist and that malicious use of the data is possible, for instance to aid the creation of malicious code. We note that this is a risk shared by any code dataset made openly available.
Moreover, we remark that the data may contain harmful or offensive language, which could be learned by models trained on it.
## Discussion of Biases
The data is collected from GitHub and consists of naturally occurring text on that platform. As a consequence, certain languages are more or less likely to contain well-documented code, and as such the resulting data will not be uniformly represented in terms of programming languages.
## Known limitations
The dataset can be expanded to further improve its coverage.
Moreover, we use naturally occurring text from comments and docstrings rather than text produced by human annotators. As such, the resulting data will have high variance in quality depending on the practices of sub-communities of software developers. However, we remark that the task our evaluation dataset defines is reflective of what searching a real codebase would look like.
Finally, we note that some imbalance in the data is observed for the same reason: certain languages are more or less likely to contain well-documented code.
## Maintenance plan:
The data will be kept up to date by following The Stack releases. We will rerun our pipeline for every new release and add non-overlapping new content to both the training and testing partitions of our data.
This is so that we carry over opt-out updates and include fresh repos.
## Update plan:
- Cover all 6 programming languages from CodeSearchNet.
## Licensing Information
M2CRB is a subset filtered and pre-processed from [The Stack](https://huggingface.co/datasets/bigcode/the-stack), a collection of source code from repositories with various licenses. Any use of all or part of the code gathered in M2CRB must abide by the terms of the original licenses. |
heliosprime/twitter_dataset_1713155983 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9763
num_examples: 24
download_size: 12473
dataset_size: 9763
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713155983"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lansinuote/diffusion.4.text_to_image | ---
dataset_info:
features:
- name: image
dtype: image
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 119636585.0
num_examples: 833
download_size: 0
dataset_size: 119636585.0
---
# Dataset Card for "diffusion.4.text_to_image"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pln-udelar/uy22 | ---
license: mit
language:
- es
pretty_name: uy22
--- |
musfiqdehan/preprocessed-BanglaNMT | ---
license: cc-by-sa-4.0
---
|
hazyresearch/based-swde-old | ---
license: apache-2.0
---
|
heinrichreimer/health-questions | ---
language:
- en
tags:
- Health
- Question Answering
size_categories:
- 1M<n<10M
dataset_info:
- config_name: silver
features:
- name: id
dtype: string
- name: text
dtype: string
- name: health_related_label
dtype:
class_label:
names:
'0': not_health_related
'1': health_related
- name: medical_label
dtype:
class_label:
names:
'0': not_medical
'1': medical
splits:
- name: train
num_bytes: 750040934
num_examples: 6835271
- name: validation
num_bytes: 187523993
num_examples: 1708818
download_size: 0
dataset_size: 937564927
- config_name: golden
features:
- name: id
dtype: string
- name: text
dtype: string
- name: health_related_label
dtype:
class_label:
names:
'0': not_health_related
'1': health_related
- name: medical_label
dtype:
class_label:
names:
'0': not_medical
'1': medical
splits:
- name: test
num_bytes: 163495
num_examples: 1489
- name: train
num_bytes: 489298
num_examples: 4466
- name: validation
num_bytes: 163015
num_examples: 1489
download_size: 0
dataset_size: 815808
---
# ⚕️ health-questions
TODO |
SeaEval/SeaEval_datasets | ---
license: cc-by-nc-4.0
configs:
- config_name: cross_xquad
data_files:
- split: test
path: "cross_xquad.json"
- config_name: cross_mmlu
data_files:
- split: test
path: "cross_mmlu.json"
- config_name: cross_logiqa
data_files:
- split: test
path: "cross_logiqa.json"
- config_name: us_eval
data_files:
- split: test
path: "us_eval.json"
- config_name: sg_eval
data_files:
- split: test
path: "sg_eval.json"
- config_name: cn_eval
data_files:
- split: test
path: "cn_eval.json"
- config_name: ph_eval
data_files:
- split: test
path: "ph_eval.json"
- config_name: flores_ind2eng
data_files:
- split: test
path: "flores_ind2eng.json"
- config_name: flores_vie2eng
data_files:
- split: test
path: "flores_vie2eng.json"
- config_name: flores_zho2eng
data_files:
- split: test
path: "flores_zho2eng.json"
- config_name: flores_zsm2eng
data_files:
- split: test
path: "flores_zsm2eng.json"
- config_name: mmlu
data_files:
- split: test
path: "mmlu.json"
- config_name: mmlu_full
data_files:
- split: test
path: "mmlu_full.json"
- config_name: c_eval
data_files:
- split: test
path: "c_eval.json"
- config_name: c_eval_full
data_files:
- split: test
path: "c_eval_full.json"
- config_name: cmmlu
data_files:
- split: test
path: "cmmlu.json"
- config_name: cmmlu_full
data_files:
- split: test
path: "cmmlu_full.json"
- config_name: zbench
data_files:
- split: test
path: "zbench.json"
- config_name: ind_emotion
data_files:
- split: test
path: "ind_emotion.json"
- config_name: ocnli
data_files:
- split: test
path: "ocnli.json"
- config_name: c3
data_files:
- split: test
path: "c3.json"
- config_name: dream
data_files:
- split: test
path: "dream.json"
- config_name: samsum
data_files:
- split: test
path: "samsum.json"
- config_name: dialogsum
data_files:
- split: test
path: "dialogsum.json"
- config_name: sst2
data_files:
- split: test
path: "sst2.json"
- config_name: cola
data_files:
- split: test
path: "cola.json"
- config_name: qqp
data_files:
- split: test
path: "qqp.json"
- config_name: mnli
data_files:
- split: test
path: "mnli.json"
- config_name: qnli
data_files:
- split: test
path: "qnli.json"
- config_name: wnli
data_files:
- split: test
path: "wnli.json"
- config_name: rte
data_files:
- split: test
path: "rte.json"
- config_name: mrpc
data_files:
- split: test
path: "mrpc.json"
- config_name: indommlu
data_files:
- split: test
path: "indommlu.json"
---
[GitHub](https://github.com/SeaEval/SeaEval) · [Website](https://seaeval.github.io/)
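Each config listed above is loaded by passing its name to `load_dataset`. A minimal sketch of selecting a config by name (config names are taken from this card; the actual download requires the `datasets` library and network access, so that call is shown commented out):

```python
# Config names as declared in this card's YAML.
CROSS_LINGUAL_CONFIGS = ["cross_xquad", "cross_mmlu", "cross_logiqa"]
TRANSLATION_CONFIGS = [
    "flores_ind2eng", "flores_vie2eng", "flores_zho2eng", "flores_zsm2eng",
]

config = CROSS_LINGUAL_CONFIGS[1]
print(config)  # cross_mmlu

# from datasets import load_dataset
# data = load_dataset("SeaEval/SeaEval_datasets", config, split="test")
```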
```
@article{SeaEval,
title={SeaEval for Multilingual Foundation Models: From Cross-Lingual Alignment to Cultural Reasoning},
author={Wang, Bin and Liu, Zhengyuan and Huang, Xin and Jiao, Fangkai and Ding, Yang and Aw, Ai Ti and Chen, Nancy F.},
journal={NAACL},
year={2024}
}
``` |
tea90210/mltest | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 205326
num_examples: 100
download_size: 115128
dataset_size: 205326
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mltest"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_48 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1151647456.0
num_examples: 226168
download_size: 1172695090
dataset_size: 1151647456.0
---
# Dataset Card for "chunk_48"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
abross/youtube-transcriptions | ---
license: afl-3.0
---
|
alarmod/MRI | ---
license: gpl-3.0
---
|
JoseGamer/Myvoice | ---
license: openrail
---
|
Shubh8434/All | ---
license: apache-2.0
---
|