| datasetId | card |
|---|---|
Qdrant/wolt-food-clip-ViT-B-32-embeddings | ---
language:
- en
pretty_name: clip-ViT-B-32 embeddings of the Wolt food images
task_categories:
- feature-extraction
size_categories:
- 1M<n<10M
---
# wolt-food-clip-ViT-B-32-embeddings
Qdrant's [Food Discovery](https://food-discovery.qdrant.tech/) demo relies on a dataset of food images from the Wolt
app. Each point in the collection represents a dish with a single image. The image is encoded as a 512-dimensional
vector of floats.
## Generation process
The embeddings were generated with the clip-ViT-B-32 model using the following code snippet:
```python
from PIL import Image
from sentence_transformers import SentenceTransformer
image_path = "5dbfd216-5cce-11eb-8122-de94874ad1c8_ns_takeaway_seelachs_ei_baguette.jpeg"
model = SentenceTransformer("clip-ViT-B-32")
embedding = model.encode(Image.open(image_path))
``` |
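Because each dish is stored as a 512-dimensional embedding, similar dishes can be found by comparing vectors with cosine similarity. A minimal sketch of that comparison, using NumPy with random vectors standing in for real CLIP image embeddings (the helper name `cosine_similarity` is illustrative, not part of the dataset's tooling):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Stand-ins for two 512-dimensional CLIP image embeddings.
rng = np.random.default_rng(0)
emb_a = rng.normal(size=512)
emb_b = rng.normal(size=512)

score = cosine_similarity(emb_a, emb_b)  # a value in [-1, 1]
```

In practice a vector database such as Qdrant performs this comparison server-side over the whole collection, so the client only submits a query vector.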
frncscp/patacon-730-redux3d | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Patacon-False
'1': Patacon-True
- name: pca
sequence:
sequence: float64
- name: index
dtype: int64
splits:
- name: train
num_bytes: 2907513752.0
num_examples: 874
- name: validation
num_bytes: 476973727.0
num_examples: 143
- name: test
num_bytes: 1471669138.0
num_examples: 442
download_size: 3108353305
dataset_size: 4856156617.0
---
# Dataset Card for "patacon-730-redux3d"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO | ---
pretty_name: Evaluation run of eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO](https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T18:58:11.659333](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO/blob/main/results_2024-02-29T18-58-11.659333.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6539804745320416,\n\
\ \"acc_stderr\": 0.0320592360759075,\n \"acc_norm\": 0.6534931846624927,\n\
\ \"acc_norm_stderr\": 0.03272907337494822,\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7745156520967758,\n\
\ \"mc2_stderr\": 0.013796529706600775\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068745,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710695\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7152957578171679,\n\
\ \"acc_stderr\": 0.0045035118550500325,\n \"acc_norm\": 0.8908583947420833,\n\
\ \"acc_norm_stderr\": 0.003111795320787943\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726855,\n \"\
acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726855\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4435754189944134,\n\
\ \"acc_stderr\": 0.016615680401003724,\n \"acc_norm\": 0.4435754189944134,\n\
\ \"acc_norm_stderr\": 0.016615680401003724\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015055,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015055\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.7745156520967758,\n\
\ \"mc2_stderr\": 0.013796529706600775\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065597\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6944655041698257,\n \
\ \"acc_stderr\": 0.012688134076726879\n }\n}\n```"
repo_url: https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|arc:challenge|25_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|gsm8k|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hellaswag|10_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-58-11.659333.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T18-58-11.659333.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- '**/details_harness|winogrande|5_2024-02-29T18-58-11.659333.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T18-58-11.659333.parquet'
- config_name: results
data_files:
- split: 2024_02_29T18_58_11.659333
path:
- results_2024-02-29T18-58-11.659333.parquet
- split: latest
path:
- results_2024-02-29T18-58-11.659333.parquet
---
# Dataset Card for Evaluation run of eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO](https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-29T18:58:11.659333](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO/blob/main/results_2024-02-29T18-58-11.659333.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6539804745320416,
"acc_stderr": 0.0320592360759075,
"acc_norm": 0.6534931846624927,
"acc_norm_stderr": 0.03272907337494822,
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7745156520967758,
"mc2_stderr": 0.013796529706600775
},
"harness|arc:challenge|25": {
"acc": 0.7013651877133106,
"acc_stderr": 0.013374078615068745,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710695
},
"harness|hellaswag|10": {
"acc": 0.7152957578171679,
"acc_stderr": 0.0045035118550500325,
"acc_norm": 0.8908583947420833,
"acc_norm_stderr": 0.003111795320787943
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726855,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4435754189944134,
"acc_stderr": 0.016615680401003724,
"acc_norm": 0.4435754189944134,
"acc_norm_stderr": 0.016615680401003724
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015055,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015055
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.7745156520967758,
"mc2_stderr": 0.013796529706600775
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.010099208246065597
},
"harness|gsm8k|5": {
"acc": 0.6944655041698257,
"acc_stderr": 0.012688134076726879
}
}
```
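As a rough sketch (not part of the evaluation harness), an aggregate accuracy can be recomputed from the per-task entries; the snippet below hard-codes three of the `hendrycksTest` scores copied from the JSON above rather than loading the dataset:

```python
# Sketch: recomputing an aggregate accuracy from per-task results.
# `results` is a small hand-copied excerpt of the JSON above, not a live load.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6444444444444445},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6973684210526315},
}

# Average the "acc" metric over every hendrycksTest task in the excerpt.
accs = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))  # 0.5606
```

Averaging over all 57 `hendrycksTest` tasks in the full JSON reproduces the MMLU-style aggregate reported under `"all"`.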
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-staging-eval-project-200453bd-7694965 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- masakhaner
eval_info:
task: entity_extraction
model: mbeukman/xlm-roberta-base-finetuned-swahili-finetuned-ner-wolof
metrics: []
dataset_name: masakhaner
dataset_config: swa
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: mbeukman/xlm-roberta-base-finetuned-swahili-finetuned-ner-wolof
* Dataset: masakhaner
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
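As a minimal sketch of what the `col_mapping` in the metadata above describes (interpreting the keys as the columns the token-classification evaluator expects and the values as the dataset's column names — an assumption based on the YAML, and the record below is made up for illustration):

```python
# Column mapping from the eval_info metadata above: evaluator-expected
# column name -> actual column name in the masakhaner dataset.
col_mapping = {"tokens": "tokens", "tags": "ner_tags"}

# A made-up masakhaner-style record, purely for illustration.
record = {"tokens": ["Saidi", "anaishi", "Dar"], "ner_tags": [1, 0, 5]}

# Rename the dataset columns to the names the evaluator expects.
remapped = {task_col: record[ds_col] for task_col, ds_col in col_mapping.items()}
print(remapped)  # {'tokens': ['Saidi', 'anaishi', 'Dar'], 'tags': [1, 0, 5]}
```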
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
alexinigoc/DatasetTraining | ---
license: afl-3.0
---
|
tj-solergibert/Europarl-ST-processed-mt-en | ---
dataset_info:
features:
- name: source_text
dtype: string
- name: dest_text
dtype: string
- name: dest_lang
dtype:
class_label:
names:
'0': de
'1': en
'2': es
'3': fr
'4': it
'5': nl
'6': pl
'7': pt
'8': ro
splits:
- name: train
num_bytes: 198087377
num_examples: 602605
- name: valid
num_bytes: 27678568
num_examples: 81968
- name: test
num_bytes: 29120332
num_examples: 86170
download_size: 104863110
dataset_size: 254886277
---
# Dataset Card for "Europarl-ST-processed-mt-en"
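The YAML metadata above encodes `dest_lang` as a class label over nine languages, where the integer id is the index into the declared `names` list. A minimal, self-contained sketch of that mapping (no `datasets` dependency; the order is taken from the `dataset_info` above):

```python
# Label names in the order declared in the dataset_info above;
# the class id is simply the index into this list.
names = ["de", "en", "es", "fr", "it", "nl", "pl", "pt", "ro"]

label2id = {name: i for i, name in enumerate(names)}
id2label = {i: name for i, name in enumerate(names)}

print(label2id["es"])  # 2
print(id2label[8])     # ro
```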
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_aisquared__dlite-v2-774m | ---
pretty_name: Evaluation run of aisquared/dlite-v2-774m
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [aisquared/dlite-v2-774m](https://huggingface.co/aisquared/dlite-v2-774m) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_aisquared__dlite-v2-774m\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-13T06:47:53.119042](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-774m/blob/main/results_2023-10-13T06-47-53.119042.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.009437919463087249,\n\
\ \"em_stderr\": 0.0009901902239103345,\n \"f1\": 0.059256501677852416,\n\
\ \"f1_stderr\": 0.0015878342558663697,\n \"acc\": 0.26992896606156275,\n\
\ \"acc_stderr\": 0.007003882714182583\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.009437919463087249,\n \"em_stderr\": 0.0009901902239103345,\n\
\ \"f1\": 0.059256501677852416,\n \"f1_stderr\": 0.0015878342558663697\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5398579321231255,\n\
\ \"acc_stderr\": 0.014007765428365166\n }\n}\n```"
repo_url: https://huggingface.co/aisquared/dlite-v2-774m
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_13T06_47_53.119042
path:
- '**/details_harness|drop|3_2023-10-13T06-47-53.119042.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-13T06-47-53.119042.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_13T06_47_53.119042
path:
- '**/details_harness|gsm8k|5_2023-10-13T06-47-53.119042.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-13T06-47-53.119042.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:27:10.189986.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:27:10.189986.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:27:10.189986.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_13T06_47_53.119042
path:
- '**/details_harness|winogrande|5_2023-10-13T06-47-53.119042.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-13T06-47-53.119042.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_27_10.189986
path:
- results_2023-07-19T14:27:10.189986.parquet
- split: 2023_10_13T06_47_53.119042
path:
- results_2023-10-13T06-47-53.119042.parquet
- split: latest
path:
- results_2023-10-13T06-47-53.119042.parquet
---
# Dataset Card for Evaluation run of aisquared/dlite-v2-774m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/aisquared/dlite-v2-774m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [aisquared/dlite-v2-774m](https://huggingface.co/aisquared/dlite-v2-774m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_aisquared__dlite-v2-774m",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-13T06:47:53.119042](https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-774m/blob/main/results_2023-10-13T06-47-53.119042.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.009437919463087249,
"em_stderr": 0.0009901902239103345,
"f1": 0.059256501677852416,
"f1_stderr": 0.0015878342558663697,
"acc": 0.26992896606156275,
"acc_stderr": 0.007003882714182583
},
"harness|drop|3": {
"em": 0.009437919463087249,
"em_stderr": 0.0009901902239103345,
"f1": 0.059256501677852416,
"f1_stderr": 0.0015878342558663697
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5398579321231255,
"acc_stderr": 0.014007765428365166
}
}
```
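As a quick sanity check, the aggregate "acc" reported under "all" above can be recomputed from the per-task entries; a minimal Python sketch (values copied from the JSON above, not fetched from the Hub, and assuming the aggregate is a simple unweighted mean over the acc-reporting tasks):

```python
# Per-task accuracy values, copied from the latest-results JSON above.
latest_results = {
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|winogrande|5": {"acc": 0.5398579321231255, "acc_stderr": 0.014007765428365166},
}

# Unweighted mean of the per-task accuracies.
accs = [scores["acc"] for scores in latest_results.values()]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # matches the "acc" value reported under "all"
```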
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
maruf7705/my-copy-model-just-try | ---
license: apache-2.0
---
|
jlbaker361/kaggle_females_dim_128_50k | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: src
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 1139379975.0
num_examples: 50000
download_size: 1137317892
dataset_size: 1139379975.0
---
# Dataset Card for "kaggle_females_dim_128_50k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
prnv13/landcover_data_rgb | ---
dataset_info:
features:
- name: image
dtype: image
- name: annotation
dtype: image
splits:
- name: train
num_bytes: 1343062608.0
num_examples: 513
- name: validation
num_bytes: 334584359.0
num_examples: 129
- name: test
num_bytes: 412226748.0
num_examples: 161
download_size: 2078607111
dataset_size: 2089873715.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_Undi95__CodeEngine | ---
pretty_name: Evaluation run of Undi95/CodeEngine
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/CodeEngine](https://huggingface.co/Undi95/CodeEngine) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__CodeEngine\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T07:16:21.496689](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__CodeEngine/blob/main/results_2023-10-25T07-16-21.496689.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.31008808724832215,\n\
\ \"em_stderr\": 0.004736734191590966,\n \"f1\": 0.4059154781879224,\n\
\ \"f1_stderr\": 0.004594505528583743,\n \"acc\": 0.38050967793280527,\n\
\ \"acc_stderr\": 0.00780116508471732\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.31008808724832215,\n \"em_stderr\": 0.004736734191590966,\n\
\ \"f1\": 0.4059154781879224,\n \"f1_stderr\": 0.004594505528583743\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.015163002274450341,\n \
\ \"acc_stderr\": 0.0033660229497263702\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7458563535911602,\n \"acc_stderr\": 0.012236307219708269\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Undi95/CodeEngine
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|arc:challenge|25_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T07_16_21.496689
path:
- '**/details_harness|drop|3_2023-10-25T07-16-21.496689.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T07-16-21.496689.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T07_16_21.496689
path:
- '**/details_harness|gsm8k|5_2023-10-25T07-16-21.496689.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T07-16-21.496689.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hellaswag|10_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-51-31.235775.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T11-51-31.235775.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T11-51-31.235775.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T07_16_21.496689
path:
- '**/details_harness|winogrande|5_2023-10-25T07-16-21.496689.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T07-16-21.496689.parquet'
- config_name: results
data_files:
- split: 2023_09_12T11_51_31.235775
path:
- results_2023-09-12T11-51-31.235775.parquet
- split: 2023_10_25T07_16_21.496689
path:
- results_2023-10-25T07-16-21.496689.parquet
- split: latest
path:
- results_2023-10-25T07-16-21.496689.parquet
---
# Dataset Card for Evaluation run of Undi95/CodeEngine
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/CodeEngine
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/CodeEngine](https://huggingface.co/Undi95/CodeEngine) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__CodeEngine",
    "harness_winogrande_5",
    split="latest")
```
## Latest results
These are the [latest results from run 2023-10-25T07:16:21.496689](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__CodeEngine/blob/main/results_2023-10-25T07-16-21.496689.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.31008808724832215,
"em_stderr": 0.004736734191590966,
"f1": 0.4059154781879224,
"f1_stderr": 0.004594505528583743,
"acc": 0.38050967793280527,
"acc_stderr": 0.00780116508471732
},
"harness|drop|3": {
"em": 0.31008808724832215,
"em_stderr": 0.004736734191590966,
"f1": 0.4059154781879224,
"f1_stderr": 0.004594505528583743
},
"harness|gsm8k|5": {
"acc": 0.015163002274450341,
"acc_stderr": 0.0033660229497263702
},
"harness|winogrande|5": {
"acc": 0.7458563535911602,
"acc_stderr": 0.012236307219708269
}
}
```
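For a quick programmatic sanity check, the aggregated metrics above are plain JSON and can be inspected directly; a minimal sketch (it inlines an excerpt of the dictionary shown above):

```python
import json

# Excerpt of the latest aggregated results shown above.
latest = json.loads("""
{
  "harness|gsm8k|5": {"acc": 0.015163002274450341},
  "harness|winogrande|5": {"acc": 0.7458563535911602}
}
""")

winogrande_acc = latest["harness|winogrande|5"]["acc"]
print(f"winogrande 5-shot acc: {winogrande_acc:.4f}")
```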
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dev7halo/korean-mcfaq | ---
license: apache-2.0
language:
- ko
---
## Usage
```bash
pip install datasets
```
```python
from datasets import load_dataset
dataset = load_dataset("dev7halo/korean-mcfaq")
```
```
DatasetDict({
train: Dataset({
features: ['Unnamed: 0', '제목', '등록일', '질문', '답변'],
num_rows: 2452
})
})
```
```
# dataset['train'][0]
{'Unnamed: 0': 0,
'제목': "'언젠가', '언젠가는'의 표현",
'등록일': '2019. 12. 6. ',
'질문': '\n\t\t \n\t\t \n\t\t"저는 언젠가 간호사가 되고 싶어요."와 같이 쓸 때, 미래의 불특정한 때를 나타내는 \'언젠가\'라는 단어를 \'언젠가는\'이라고 써도 되나요? \'언젠가\'가 표준어인 것 같은데, 뒤에 \'는\'을 쓴 \'언젠가는\'이 더 많이 쓰이는 것 같아요.\n\t\t \n\t\t \n\t',
'답변': "\n\t\t\t \n\t\t\t \n\t\t\t\xa0'미래의 어느 때에 가서는'을 뜻하는 부사 '언젠가'를 강조하기 위하여, '강조'의 뜻을 나타내는 보조사 '는'을 붙여 '언젠가는'과 같이 쓸 수 있습니다.\n\t\t\t \n\t\t"}
```
# Github
[Github](https://github.com/HaloKim/korean-mcfaq) |
yn01/test_20240125_03 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 156602
num_examples: 614
download_size: 32443
dataset_size: 156602
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Mxode/C-Language-Chat-Debug-Multiturn-Zh | ---
license: apache-2.0
task_categories:
- question-answering
- conversational
language:
- zh
tags:
- code
- C
- chat
- assistant
- multiturn
- debug
---
About 1,300 user-assistant **multi-turn conversations** in **C language** scenarios. Each conversation has been organized into a single-line format. A sample looks like this:
```JSON
{
"id": 1045,
"conversation": [
{
"user": "你好,AI助手。我最近在写一个C语言程序,但是遇到了一些问题,希望你能帮我检查一下。",
"assistant": "你好,我很乐意帮助你。请把你的代码发给我,我会尽快检查并给出建议。"
},
{
"user": "好的,这是我的代码。这段代码的主要功能是计算斐波那契数列的前n项。",
"assistant": "让我看一下......嗯,这里有一个小错误。在第10行,你应该使用`++i`而不是`i++`来递增i的值。修改后的代码应该是这样的\\n```c\\nfor (int i = 0; i < n; ++i) {\\n if (i == 0 || i == 1) {\\n printf(\\\"%d \\\", fib[i]);\\n } else {\\n fib[i] = fib[i - 1] + fib[i - 2];\\n printf(\\\"%d \\\", fib[i]);\\n }\\n}\\n```"
},
{
"user": "谢谢你的帮助!我还有一个问题。这段代码中有一个递归函数,用于计算斐波那契数列。但是我发现这个递归函数的效率很低,因为它会重复计算很多相同的子问题。有什么方法可以优化这个递归函数吗?",
"assistant": "当然可以。你可以尝试使用动态规划的方法来优化这个递归函数。这样可以将已经计算过的子问题的解存储起来,避免重复计算。下面是一个使用动态规划优化后的斐波那契数列计算函数\\n```c\\n#include <stdio.h>\\nint fib[100]; // 用于存储斐波那契数列的数组\\nvoid fib_dp(int n) {\\n fib[0] = 0;\\n fib[1] = 1;\\n for (int i = 2; i <= n; ++i) {\\n fib[i] = fib[i - 1] + fib[i - 2];\\n }\\n}\\nint main() {\\n int n = 10; // 你想要计算的斐波那契数列的项数\\n fib_dp(n);\\n for (int i = 0; i <= n; ++i) {\\n printf(\\\"%d \\\", fib[i]);\\n }\\n return 0;\\n}\\n```"
},
{
"user": "哇,这真是太棒了!你的建议大大提高了程序的运行速度。再次感谢你的帮助!\n",
"assistant": ""
}
]
}
```
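Because each conversation is stored as a single-line JSON object, iterating over the turns only needs the standard `json` module; a minimal sketch (the sample line below is made up, but follows the field names shown above):

```python
import json

# A made-up single-line record following the schema of the sample above.
line = ('{"id": 1, "conversation": ['
        '{"user": "你好", "assistant": "你好,有什么可以帮你?"},'
        '{"user": "谢谢!", "assistant": ""}]}')

record = json.loads(line)
for turn in record["conversation"]:
    print("user:", turn["user"])
    if turn["assistant"]:  # the final turn may carry an empty assistant reply
        print("assistant:", turn["assistant"])
```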
The assistant is referred to as "AI助手" ("AI assistant") throughout; to build conversations with a specific assistant name, **you can globally replace "AI助手".**
bigscience-data/roots_indic-as_wikipedia | ---
language: as
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_indic-as_wikipedia
# wikipedia
- Dataset uid: `wikipedia`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 3.2299 % of total
- 4.2071 % of en
- 5.6773 % of ar
- 3.3416 % of fr
- 5.2815 % of es
- 12.4852 % of ca
- 0.4288 % of zh
- 0.4286 % of zh
- 5.4743 % of indic-bn
- 8.9062 % of indic-ta
- 21.3313 % of indic-te
- 4.4845 % of pt
- 4.0493 % of indic-hi
- 11.3163 % of indic-ml
- 22.5300 % of indic-ur
- 4.4902 % of vi
- 16.9916 % of indic-kn
- 24.7820 % of eu
- 11.6241 % of indic-mr
- 9.8749 % of id
- 9.3489 % of indic-pa
- 9.4767 % of indic-gu
- 24.1132 % of indic-as
- 5.3309 % of indic-or
### BigScience processing steps
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: fr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: es
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: ca
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: zh
#### Filters applied to: zh
#### Filters applied to: indic-bn
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: pt
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: vi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: id
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-as
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-or
- filter_wiki_user_titles
- dedup_document
- filter_remove_empty_docs
|
Arris/predis-predis-faiss | ---
license: mit
---
|
StudentLLM/Sampled_Orca_GPT4 | ---
language:
- en
size_categories:
- 10K<n<100K
license: mit
---
# Stratified Sample of Open-Orca 🐬
This dataset is a stratified sample of Open-Orca's GPT-4-answered dataset (1M-GPT4-Augmented.parquet) [[Link](https://huggingface.co/datasets/Open-Orca/OpenOrca)]
To sample the dataset with stratification, the `train_test_split` function from the scikit-learn library was used.
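A minimal sketch of this kind of stratified sampling with scikit-learn (the toy data and field names are illustrative, not the actual Open-Orca schema):

```python
from sklearn.model_selection import train_test_split

# Toy stand-in: 'ids' plays the role of the 'id' column used for stratification.
ids = ["cot"] * 20 + ["niv"] * 20
rows = [f"row-{i}" for i in range(40)]

# Keep 5% of the data, stratified on the id field (mirroring the setup here).
_, sampled_rows, _, sampled_ids = train_test_split(
    rows, ids, test_size=0.05, shuffle=True, stratify=ids, random_state=0
)
```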
The specific setup of sampling is as follows:
- split_size: 0.05
- shuffle: True
- stratify: `'id'` of Open-Orca dataset |
thangvip/cti-dataset-split | ---
dataset_info:
- config_name: default
features:
- name: sentence_idx
dtype: int64
- name: words
sequence: string
- name: POS
sequence: int64
- name: tag
sequence: int64
splits:
- name: train
num_bytes: 16917605
num_examples: 17480
download_size: 2164774
dataset_size: 16917605
- config_name: subset1
features:
- name: sentence_idx
dtype: int64
- name: words
sequence: string
- name: POS
sequence: int64
- name: tag
sequence: int64
splits:
- name: train
num_bytes: 13350196.989130436
num_examples: 13794
download_size: 2008529
dataset_size: 13350196.989130436
- config_name: subset2
features:
- name: sentence_idx
dtype: int64
- name: words
sequence: string
- name: POS
sequence: int64
- name: tag
sequence: int64
splits:
- name: test
num_bytes: 3338033.1604691073
num_examples: 3449
download_size: 502967
dataset_size: 3338033.1604691073
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: subset1
data_files:
- split: train
path: subset1/train-*
- config_name: subset2
data_files:
- split: test
path: subset2/test-*
---
```python
#these dictionary are useful for this dataset
pos_2_id = {'#': 0, '$': 1, "''": 2, '(': 3, ')': 4, '.': 5, ':': 6, 'CC': 7, 'CD': 8, 'DT': 9, 'EX': 10, 'FW': 11, 'IN': 12, 'JJ': 13, 'JJR': 14, 'JJS': 15, 'MD': 16, 'NN': 17, 'NNP': 18, 'NNPS': 19, 'NNS': 20, 'PDT': 21, 'POS': 22, 'PRP': 23, 'PRP$': 24, 'RB': 25, 'RBR': 26, 'RBS': 27, 'RP': 28, 'TO': 29, 'VB': 30, 'VBD': 31, 'VBG': 32, 'VBN': 33, 'VBP': 34, 'VBZ': 35, 'WDT': 36, 'WP': 37, 'WP$': 38, 'WRB': 39}
id_2_pos = {0: '#', 1: '$', 2: "''", 3: '(', 4: ')', 5: '.', 6: ':', 7: 'CC', 8: 'CD', 9: 'DT', 10: 'EX', 11: 'FW', 12: 'IN', 13: 'JJ', 14: 'JJR', 15: 'JJS', 16: 'MD', 17: 'NN', 18: 'NNP', 19: 'NNPS', 20: 'NNS', 21: 'PDT', 22: 'POS', 23: 'PRP', 24: 'PRP$', 25: 'RB', 26: 'RBR', 27: 'RBS', 28: 'RP', 29: 'TO', 30: 'VB', 31: 'VBD', 32: 'VBG', 33: 'VBN', 34: 'VBP', 35: 'VBZ', 36: 'WDT', 37: 'WP', 38: 'WP$', 39: 'WRB'}
tag_2_id = {'B-application': 0, 'B-cve id': 1, 'B-edition': 2, 'B-file': 3, 'B-function': 4, 'B-hardware': 5, 'B-language': 6, 'B-method': 7, 'B-os': 8, 'B-parameter': 9, 'B-programming language': 10, 'B-relevant_term': 11, 'B-update': 12, 'B-vendor': 13, 'B-version': 14, 'I-application': 15, 'I-edition': 16, 'I-hardware': 17, 'I-os': 18, 'I-relevant_term': 19, 'I-update': 20, 'I-vendor': 21, 'I-version': 22, 'O': 23}
id_2_tag = {0: 'B-application', 1: 'B-cve id', 2: 'B-edition', 3: 'B-file', 4: 'B-function', 5: 'B-hardware', 6: 'B-language', 7: 'B-method', 8: 'B-os', 9: 'B-parameter', 10: 'B-programming language', 11: 'B-relevant_term', 12: 'B-update', 13: 'B-vendor', 14: 'B-version', 15: 'I-application', 16: 'I-edition', 17: 'I-hardware', 18: 'I-os', 19: 'I-relevant_term', 20: 'I-update', 21: 'I-vendor', 22: 'I-version', 23: 'O'}
```
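As an illustration, the integer sequences stored in each row can be decoded back to labels via these mappings; a minimal sketch with made-up ids (using a small excerpt of the dictionaries above):

```python
# Small excerpts of the full id_2_pos / id_2_tag mappings defined above.
id_2_pos = {9: 'DT', 17: 'NN', 30: 'VB'}
id_2_tag = {13: 'B-vendor', 15: 'I-application', 23: 'O'}

# A made-up example row in the dataset's (POS, tag) format.
pos_ids = [9, 17, 30]
tag_ids = [13, 23, 23]

pos_labels = [id_2_pos[i] for i in pos_ids]
tag_labels = [id_2_tag[i] for i in tag_ids]
print(list(zip(pos_labels, tag_labels)))
```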
|
rntc/blurb_bc5chem_a | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: type
dtype: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B
'2': I
splits:
- name: train
num_bytes: 52119955
num_examples: 4560
- name: validation
num_bytes: 50451683
num_examples: 4581
- name: test
num_bytes: 52811254
num_examples: 4797
download_size: 19224122
dataset_size: 155382892
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
KennethEnevoldsen/spontanous-speech-qa | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
annotations_creators:
- found
source_datasets:
- DDSC/partial-danish-gigaword-no-twitter
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 44345.110764430574
num_examples: 512
- name: test
num_bytes: 11172.889235569422
num_examples: 129
download_size: 37996
dataset_size: 55518
task_categories:
- question-answering
language:
- da
tags:
- conversational
pretty_name: Spontaneous speech QA
size_categories:
- n<1K
---
# Spontaneous speech QA
This dataset contains QA pairs from the spontaneous speech subsection of the Danish Gigaword.
The dataset is created from the [DDSC dataset](DDSC/partial-danish-gigaword-no-twitter) and
filtered to only include QA pairs where the question is less than 20 tokens and the answer is
at least 4 tokens long.
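A sketch of what that length filter might look like (whitespace tokenization is an assumption; the accompanying script may tokenize differently):

```python
def keep_pair(question: str, answer: str) -> bool:
    # Question under 20 tokens, answer at least 4 tokens (whitespace tokens).
    return len(question.split()) < 20 and len(answer.split()) >= 4

pairs = [
    ("Hvordan har du det?", "Jeg har det rigtig godt i dag."),
    ("Hvorfor?", "Fordi."),
]
kept = [p for p in pairs if keep_pair(*p)]
```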
To find out more about the creation, see the accompanying script. |
uatafaque/movehd | ---
license: openrail
---
|
cahya/instructions-de | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 19823735.602530096
num_examples: 41903
- name: test
num_bytes: 521814.1987349521
num_examples: 1103
- name: validation
num_bytes: 521814.1987349521
num_examples: 1103
download_size: 12101999
dataset_size: 20867363.999999996
---
# Dataset Card for "instructions-de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JessicaYuan/RefSegRS | ---
license: cc-by-4.0
---
# RRSIS: Referring Remote Sensing Image Segmentation
The RefSegRS dataset used in [RRSIS: Referring Remote Sensing Image Segmentation](https://arxiv.org/abs/2306.08625).
Please kindly cite our paper if you find our dataset useful.
~~~
@article{yuan2023rrsis,
title={RRSIS: Referring Remote Sensing Image Segmentation},
author={Yuan, Zhenghang and Mou, Lichao and Hua, Yuansheng and Zhu, Xiao Xiang},
journal={arXiv preprint arXiv:2306.08625},
year={2023}
}
~~~
|
DynamicSuperb/Text2Speech_LJSpeech | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: file
dtype: string
- name: text
dtype: string
- name: reference_speech_id
dtype: string
- name: reference_speech
dtype:
audio:
sampling_rate: 22050
- name: reference_speech_transcription
dtype: string
- name: label
dtype:
audio:
sampling_rate: 22050
- name: instruction
dtype: string
splits:
- name: test
num_bytes: 7684795920.0
num_examples: 13100
download_size: 7562872204
dataset_size: 7684795920.0
---
# Dataset Card for "Text2Speech_LJSpeech"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Parth/Drug_QA | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
tags:
- biology
pretty_name: Drugs Dosage Route Date Dataset
size_categories:
- 1K<n<10K
--- |
meto/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4307372
num_examples: 1000
download_size: 2282423
dataset_size: 4307372
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/bae444a7 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 176
num_examples: 10
download_size: 1329
dataset_size: 176
---
# Dataset Card for "bae444a7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Zeel/P1 | ---
license: mit
---
|
zhengxuanzenwu/vicuna-eval-with-gpt4 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
splits:
- name: test
num_bytes: 155143
num_examples: 80
download_size: 103930
dataset_size: 155143
---
# Dataset Card for "vicuna-eval-with-gpt4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cathaysa/ele-int | ---
dataset_info:
features:
- name: ELE
dtype: string
- name: INT
dtype: string
splits:
- name: train
num_bytes: 332689.5077658303
num_examples: 1339
- name: test
num_bytes: 83234.49223416965
num_examples: 335
download_size: 294391
dataset_size: 415924.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
RodrigoPeres/br-military-institutes-entrance-questions | ---
license: mit
---
|
liuyanchen1015/MULTI_VALUE_stsb_uninflect | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 69928
num_examples: 439
- name: test
num_bytes: 48896
num_examples: 351
- name: train
num_bytes: 274246
num_examples: 1888
download_size: 250487
dataset_size: 393070
---
# Dataset Card for "MULTI_VALUE_stsb_uninflect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
innat/Grapheme128x128 | ---
license: apache-2.0
tags:
- code
size_categories:
- 10K<n<100K
---

This dataset is a preprocessed version of the dataset from [this competition](https://www.kaggle.com/competitions/bengaliai-cv19). The preprocessed data is collected from [here](https://www.kaggle.com/datasets/iafoss/grapheme-imgs-128x128).
|
Hansollll/news_section | ---
dataset_info:
features:
- name: news_content
dtype: string
splits:
- name: train
num_bytes: 705995812
num_examples: 295635
download_size: 399317655
dataset_size: 705995812
---
# Dataset Card for "news_section"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Heitechsoft/Wizard-Vicuna-MPT | ---
license: apache-2.0
tags:
- chat
- conversational
- conversation
pretty_name: Wizard Vicuna MPT
size_categories:
- 100K<n<1M
---
An MPT-compatible version of [wizard_vicuna_70k_unfiltered](https://huggingface.co/datasets/ehartford/wizard_vicuna_70k_unfiltered) |
hoangvanvietanh/my_dataset_test | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 1536483.0
num_examples: 2
download_size: 1534169
dataset_size: 1536483.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Zappandy/COT-headline-generation | ---
license: apache-2.0
---
|
akshayrakheja/ray_data_1 | ---
license: apache-2.0
---
|
RUCAIBox/Commonsense-Generation | ---
language:
- en
multilinguality:
- monolingual
task_categories:
- other
task_ids: []
tags:
- commonsense-generation
---
These are the commonsense generation datasets collected by TextBox, including:
- CommonGen (cg).
The details and leaderboard of each dataset can be found on the [TextBox page](https://github.com/RUCAIBox/TextBox#dataset). |
DelMonte/MasterPrompts | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_qqp_aint_be | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 205291
num_examples: 1000
- name: test
num_bytes: 1947460
num_examples: 9782
- name: train
num_bytes: 1875413
num_examples: 9058
download_size: 2511422
dataset_size: 4028164
---
# Dataset Card for "MULTI_VALUE_qqp_aint_be"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
parambharat/malayalam_asr_corpus | ---
annotations_creators:
- found
language:
- ml
language_creators:
- found
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Malayalam ASR Corpus
size_categories:
- 1K<n<10K
source_datasets:
- extended|common_voice
- extended|openslr
tags: []
task_categories:
- automatic-speech-recognition
task_ids: []
---
# Dataset Card for Malayalam ASR Corpus
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@parambharat](https://github.com/parambharat) for adding this dataset. |
recapper/Course_summaries_dataset | ---
language:
- en
license: apache-2.0
size_categories:
- 1M<n<10M
task_categories:
- summarization
- text2text-generation
task_ids: []
tags:
- conditional-text-generation
---
# About Dataset
The dataset consists of transcripts from a variety of YouTube videos, ranging from fast.ai and FSDL (Full Stack Deep Learning) lessons to miscellaneous instructional videos.
In total, this dataset contains 600 YouTube chapter markers and 25,000 lesson transcripts.
This dataset can be used for NLP tasks such as summarization and topic segmentation. You can refer to some of the models we have trained with this dataset
in the [github repo link](https://github.com/ohmeow/fsdl_2022_course_project) for Full Stack Deep Learning 2022 projects.
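
Since the card pairs chapter markers with lesson transcripts, here is a minimal, hypothetical sketch of slicing a timestamped transcript into per-chapter segments for topic segmentation. The `(seconds, text)` layout below is an assumption for illustration, not this dataset's actual schema:

```python
def split_by_chapters(transcript, markers):
    """Group timestamped transcript lines under their chapter markers.

    transcript: list of (start_seconds, text) tuples, sorted by time.
    markers:    list of (start_seconds, title) tuples, sorted by time.
    Both layouts are hypothetical; adapt them to the dataset's real schema.
    """
    chapters = {title: [] for _, title in markers}
    for t, text in transcript:
        current = None
        for start, title in markers:
            if start <= t:
                current = title  # last marker at or before this line wins
            else:
                break
        if current is not None:
            chapters[current].append(text)
    return chapters


lines = [(0, "welcome everyone"), (70, "first, load the data"), (150, "now we train")]
marks = [(0, "Intro"), (60, "Data"), (120, "Training")]
print(split_by_chapters(lines, marks))
```

Each chapter then becomes one candidate document for a downstream summarization model.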
|
empower-dev/function_calling_eval_parallel_call_v0 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: functions
dtype: string
splits:
- name: train
num_bytes: 633157
num_examples: 95
download_size: 41437
dataset_size: 633157
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dmayhem93/self-critiquing-critique-test | ---
dataset_info:
features:
- name: id
dtype: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: time
dtype: float64
- name: labeler
dtype: string
- name: is_topic_based_summarization
dtype: bool
- name: category
dtype: string
- name: severity
dtype: int64
- name: text_quotes
list:
- name: begin
dtype: int64
- name: end
dtype: int64
- name: response_quotes
list:
- name: begin
dtype: int64
- name: end
dtype: int64
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 43153769
num_examples: 9437
download_size: 5768752
dataset_size: 43153769
---
# Dataset Card for "self-critiquing-critique-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MrBananaHuman/kor_ethical_question_answer | ---
license: cc-by-nc-nd-4.0
---
|
gundlapalli/llme2_sft_dataset_rlaif | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 7797
num_examples: 5
download_size: 14594
dataset_size: 7797
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Chaymaa/grdf-inference-aug-iter2-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 38077646.18637993
num_examples: 446
- name: valid
num_bytes: 9811226.279569892
num_examples: 111
- name: test
num_bytes: 78617.53405017921
num_examples: 1
download_size: 45561066
dataset_size: 47967490.00000001
---
# Dataset Card for "grdf-inference-aug-iter2-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
modelloosrvcc/virus8bit | ---
license: openrail
---
|
CyberHarem/celina_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of celina (Fire Emblem)
This is the dataset of celina (Fire Emblem), containing 20 images and their tags.
The core tags of this character are `blonde_hair, earrings, blue_eyes, breasts, long_hair, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 22.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/celina_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 13.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/celina_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 41 | 26.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/celina_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 20.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/celina_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 41 | 35.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/celina_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/celina_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | jewelry, solo, 1girl, cape, elbow_gloves, simple_background, thighhighs, belt, book, thigh_boots, white_gloves, cleavage, collarbone, full_body, green_eyes, looking_at_viewer, smile, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | jewelry | solo | 1girl | cape | elbow_gloves | simple_background | thighhighs | belt | book | thigh_boots | white_gloves | cleavage | collarbone | full_body | green_eyes | looking_at_viewer | smile | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------|:-------|:--------|:-------|:---------------|:--------------------|:-------------|:-------|:-------|:--------------|:---------------|:-----------|:-------------|:------------|:-------------|:--------------------|:--------|:-------------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
d0rj/ShareGPT4V-ru | ---
dataset_info:
features:
- name: image
dtype: string
- name: id
dtype: string
- name: conversations
sequence: string
splits:
- name: train
num_bytes: 192931273
num_examples: 102025
download_size: 82097525
dataset_size: 192931273
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- visual-question-answering
- question-answering
- conversational
language:
- ru
size_categories:
- 100K<n<1M
language_creators:
- translated
multilinguality:
- monolingual
source_datasets:
- Lin-Chen/ShareGPT4V
pretty_name: ShareGPT4V (ru)
paperswithcode_id: sharegpt4v
license: cc-by-nc-4.0
tags:
- chat
- visual-chat
- multimodal-chat
---
# ShareGPT4V-ru
## Dataset Description
- **Paper:** https://huggingface.co/papers/2311.12793
- **Repository** https://github.com/InternLM/InternLM-XComposer/tree/main/projects/ShareGPT4V
- **Homepage** https://ShareGPT4V.github.io/
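
Per the front-matter schema above, each record carries an `image` path, an `id`, and a `conversations` sequence of strings. As a minimal sketch, alternating turns can be paired into (question, answer) tuples; note that strict user/assistant alternation is an assumption about the sequence layout, so verify it against actual records first:

```python
def pair_turns(conversations):
    """Pair a flat list of alternating turns into (user, assistant) tuples.

    Assumes even indices are user turns and odd indices are assistant
    replies; check this assumption against real records before relying on it.
    """
    return list(zip(conversations[0::2], conversations[1::2]))


turns = [
    "Что изображено на этом фото?",
    "На фото изображена собака на траве.",
    "Какого она цвета?",
    "Собака коричневого цвета.",
]
print(pair_turns(turns))
```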
This is the **ShareGPT4V (102k)** subset of the [Lin-Chen/ShareGPT4V](https://huggingface.co/datasets/Lin-Chen/ShareGPT4V) dataset, translated into Russian. |
CyberHarem/toujou_nozomi_lovelive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of toujou_nozomi/東條希 (Love Live!)
This is the dataset of toujou_nozomi/東條希 (Love Live!), containing 500 images and their tags.
The core tags of this character are `purple_hair, long_hair, green_eyes, breasts, twintails, large_breasts, low_twintails, hair_ornament, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 708.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/toujou_nozomi_lovelive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 400.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/toujou_nozomi_lovelive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1208 | 840.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/toujou_nozomi_lovelive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 622.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/toujou_nozomi_lovelive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1208 | 1.16 GiB | [Download](https://huggingface.co/datasets/CyberHarem/toujou_nozomi_lovelive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/toujou_nozomi_lovelive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, smile, solo, looking_at_viewer, white_dress, blush, flower, wedding_dress, bouquet, cleavage, elbow_gloves, jewelry, bare_shoulders, bridal_veil, garter_straps, open_mouth, thighhighs, tiara, white_gloves |
| 1 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, blush, skirt, hair_flower, navel, twin_braids, aqua_eyes, black_thighhighs, card, cleavage, dated, earrings, frills, holding, ribbon, very_long_hair |
| 2 | 14 |  |  |  |  |  | 1girl, blue_skirt, blush, looking_at_viewer, otonokizaka_school_uniform, pleated_skirt, solo, white_shirt, plaid_skirt, collared_shirt, hair_scrunchie, green_bowtie, striped_bowtie, pink_scrunchie, simple_background, smile, short_sleeves, summer_uniform, white_background, miniskirt, sweater_vest, black_thighhighs, long_sleeves, winter_uniform, zettai_ryouiki |
| 3 | 8 |  |  |  |  |  | 1girl, blazer, otonokizaka_school_uniform, solo, striped_bowtie, upper_body, winter_uniform, smile, blush, looking_at_viewer, pink_scrunchie, green_bowtie, long_sleeves, blue_jacket, collared_shirt, white_shirt |
| 4 | 40 |  |  |  |  |  | 1girl, solo, hair_flower, looking_at_viewer, crown, dress, smile, single_braid, vines, bare_shoulders, hair_over_shoulder, very_long_hair, thighhighs |
| 5 | 6 |  |  |  |  |  | 1girl, braid, solo, blush, looking_at_viewer, smile, hair_over_shoulder, mini_top_hat, open_mouth, skirt |
| 6 | 7 |  |  |  |  |  | 1girl, blush, looking_at_viewer, smile, solo, frills, white_gloves, aqua_eyes, earrings, parted_bangs, choker, idol, purple_dress, scrunchie, sparkle, bow, collarbone, maid_headdress, skirt, stage, star_(symbol) |
| 7 | 6 |  |  |  |  |  | 1girl, cleavage, solo, heart, looking_at_viewer, maid_headdress, blush, smile, thighhighs, very_long_hair |
| 8 | 12 |  |  |  |  |  | 1girl, solo, cloud, day, looking_at_viewer, ocean, outdoors, navel, smile, beach, blush, cleavage, frilled_bikini, hair_flower, open_mouth, blue_sky, bracelet, collarbone |
| 9 | 7 |  |  |  |  |  | 1girl, hakama_skirt, looking_at_viewer, miko, red_hakama, solo, smile, blush, wide_sleeves, outdoors, aqua_eyes, holding_broom, kimono, open_mouth, ribbon, shrine |
| 10 | 5 |  |  |  |  |  | 1girl, elbow_gloves, witch_hat, black_gloves, black_thighhighs, solo, high_heels, looking_at_viewer, smile, star_hair_ornament, aqua_eyes, capelet, card, cleavage, dress, halloween, ribbon, skirt |
| 11 | 6 |  |  |  |  |  | 1girl, kimono, looking_at_viewer, smile, solo, aqua_eyes, floral_print, obi, blush, wide_sleeves, alternate_hairstyle, hair_bow, hair_flower |
| 12 | 5 |  |  |  |  |  | 1girl, blush, navel, solo, cleavage, looking_at_viewer, underwear_only, very_long_hair, pink_bra, pink_panties, smile, thigh_gap, white_bra |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | solo | looking_at_viewer | white_dress | blush | flower | wedding_dress | bouquet | cleavage | elbow_gloves | jewelry | bare_shoulders | bridal_veil | garter_straps | open_mouth | thighhighs | tiara | white_gloves | skirt | hair_flower | navel | twin_braids | aqua_eyes | black_thighhighs | card | dated | earrings | frills | holding | ribbon | very_long_hair | blue_skirt | otonokizaka_school_uniform | pleated_skirt | white_shirt | plaid_skirt | collared_shirt | hair_scrunchie | green_bowtie | striped_bowtie | pink_scrunchie | simple_background | short_sleeves | summer_uniform | white_background | miniskirt | sweater_vest | long_sleeves | winter_uniform | zettai_ryouiki | blazer | upper_body | blue_jacket | crown | dress | single_braid | vines | hair_over_shoulder | braid | mini_top_hat | parted_bangs | choker | idol | purple_dress | scrunchie | sparkle | bow | collarbone | maid_headdress | stage | star_(symbol) | heart | cloud | day | ocean | outdoors | beach | frilled_bikini | blue_sky | bracelet | hakama_skirt | miko | red_hakama | wide_sleeves | holding_broom | kimono | shrine | witch_hat | black_gloves | high_heels | star_hair_ornament | capelet | halloween | floral_print | obi | alternate_hairstyle | hair_bow | underwear_only | pink_bra | pink_panties | thigh_gap | white_bra |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:-------|:--------------------|:--------------|:--------|:---------|:----------------|:----------|:-----------|:---------------|:----------|:-----------------|:--------------|:----------------|:-------------|:-------------|:--------|:---------------|:--------|:--------------|:--------|:--------------|:------------|:-------------------|:-------|:--------|:-----------|:---------|:----------|:---------|:-----------------|:-------------|:-----------------------------|:----------------|:--------------|:--------------|:-----------------|:-----------------|:---------------|:-----------------|:-----------------|:--------------------|:----------------|:-----------------|:-------------------|:------------|:---------------|:---------------|:-----------------|:-----------------|:---------|:-------------|:--------------|:--------|:--------|:---------------|:--------|:---------------------|:--------|:---------------|:---------------|:---------|:-------|:---------------|:------------|:----------|:------|:-------------|:-----------------|:--------|:----------------|:--------|:--------|:------|:--------|:-----------|:--------|:-----------------|:-----------|:-----------|:---------------|:-------|:-------------|:---------------|:----------------|:---------|:---------|:------------|:---------------|:-------------|:---------------------|:----------|:------------|:---------------|:------|:----------------------|:-----------|:-----------------|:-----------|:---------------|:------------|:------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | | X | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | X | | X | X | X | | | | | | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 40 |  |  |  |  |  | X | X | X | X | | | | | | | | | X | | | | X | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | X | X | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | X | X | | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 12 |  |  |  |  |  | X | X | X | X | | X | | | | X | | | | | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | X | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 10 | 5 |  |  |  |  |  | X | X | X | X | | | | | | X | X | | | | | | | | | X | | | | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | |
| 11 | 6 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | X | X | X | X | | | | | |
| 12 | 5 |  |  |  |  |  | X | X | X | X | | X | | | | X | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X |
|
AdapterOcean/med_alpaca_standardized_cluster_43_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 6296550
num_examples: 3767
download_size: 3121100
dataset_size: 6296550
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_43_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ndxbxrme/audio-diffusion-256-isolated-drums | ---
dataset_info:
features:
- name: image
dtype: image
- name: audio_file
dtype: string
- name: slice
dtype: int16
splits:
- name: train
num_bytes: 367170599.374
num_examples: 8589
download_size: 366838959
dataset_size: 367170599.374
---
# Dataset Card for "audio-diffusion-256-isolated-drums"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JohnDoe70/QuestionToContextSummarizationv2 | ---
dataset_info:
features:
- name: summary
dtype: string
- name: document
dtype: string
splits:
- name: train
num_bytes: 781136
num_examples: 3597
- name: validation
num_bytes: 99355
num_examples: 450
- name: test
num_bytes: 98805
num_examples: 450
download_size: 446273
dataset_size: 979296
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Star3073/Interview_Data_train | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 87901359
num_examples: 68075
download_size: 43102881
dataset_size: 87901359
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MaryamAlAli/my_dataset_test | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 3790023517.22
num_examples: 1588
download_size: 3217020314
dataset_size: 3790023517.22
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "my_dataset_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mesolitica/pseudolabel-malaya-speech-stt-train-whisper-large-v3 | ---
task_categories:
- automatic-speech-recognition
language:
- ms
--- |
stable-bias/stable-bias_grounding-images_multimodel_3_12_22_clusters | ---
dataset_info:
features:
- name: examplar
dtype: image
- name: centroid
sequence: float64
- name: gender_phrases
sequence: string
- name: gender_phrases_counts
sequence: int64
- name: ethnicity_phrases
sequence: string
- name: ethnicity_phrases_counts
sequence: int64
- name: example_ids
sequence: int64
splits:
- name: 12_clusters
num_bytes: 473144
num_examples: 12
- name: 24_clusters
num_bytes: 940348
num_examples: 24
- name: 48_clusters
num_bytes: 1990487
num_examples: 48
download_size: 3509518
dataset_size: 3403979
license: apache-2.0
---
# Dataset Card for "stable-bias_grounding-images_multimodel_3_12_22_clusters"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
oddadmix/colours-text-to-hex-en-ar | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 8363451
num_examples: 54218
- name: test
num_bytes: 2089471
num_examples: 13556
- name: validation
num_bytes: 417290
num_examples: 2712
download_size: 4351803
dataset_size: 10870212
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_MaziyarPanahi__Calme-12B-Instruct-v0.1 | ---
pretty_name: Evaluation run of MaziyarPanahi/Calme-12B-Instruct-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/Calme-12B-Instruct-v0.1](https://huggingface.co/MaziyarPanahi/Calme-12B-Instruct-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__Calme-12B-Instruct-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T11:58:42.351838](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Calme-12B-Instruct-v0.1/blob/main/results_2024-04-07T11-58-42.351838.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6395721812467401,\n\
\ \"acc_stderr\": 0.032545465665076455,\n \"acc_norm\": 0.6416350186157972,\n\
\ \"acc_norm_stderr\": 0.03321258498717781,\n \"mc1\": 0.5593635250917993,\n\
\ \"mc1_stderr\": 0.017379697555437442,\n \"mc2\": 0.7170955842280968,\n\
\ \"mc2_stderr\": 0.014876105209431856\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6689419795221843,\n \"acc_stderr\": 0.01375206241981783,\n\
\ \"acc_norm\": 0.6885665529010239,\n \"acc_norm_stderr\": 0.013532472099850945\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7143995220075682,\n\
\ \"acc_stderr\": 0.004507768029590103,\n \"acc_norm\": 0.8856801433977295,\n\
\ \"acc_norm_stderr\": 0.0031754904136944186\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901409,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901409\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.0286372356398009,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.0286372356398009\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.041657747757287644,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.041657747757287644\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.025542846817400513,\n \"\
acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.025542846817400513\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.5317460317460317,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"\
acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313033,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313033\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.034063153607115086,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.034063153607115086\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7219730941704036,\n\
\ \"acc_stderr\": 0.030069584874494043,\n \"acc_norm\": 0.7219730941704036,\n\
\ \"acc_norm_stderr\": 0.030069584874494043\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.0384985609879409,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.0384985609879409\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n\
\ \"acc_stderr\": 0.014179171373424384,\n \"acc_norm\": 0.8045977011494253,\n\
\ \"acc_norm_stderr\": 0.014179171373424384\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.025624723994030454,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.025624723994030454\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4111731843575419,\n\
\ \"acc_stderr\": 0.016456498033977515,\n \"acc_norm\": 0.4111731843575419,\n\
\ \"acc_norm_stderr\": 0.016456498033977515\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046637,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046637\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4810951760104302,\n\
\ \"acc_stderr\": 0.012761104871472657,\n \"acc_norm\": 0.4810951760104302,\n\
\ \"acc_norm_stderr\": 0.012761104871472657\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.01911721391149515,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.01911721391149515\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252089,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252089\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5593635250917993,\n\
\ \"mc1_stderr\": 0.017379697555437442,\n \"mc2\": 0.7170955842280968,\n\
\ \"mc2_stderr\": 0.014876105209431856\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222811\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5125094768764216,\n \
\ \"acc_stderr\": 0.01376817361508785\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/Calme-12B-Instruct-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|arc:challenge|25_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|gsm8k|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hellaswag|10_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T11-58-42.351838.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T11-58-42.351838.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- '**/details_harness|winogrande|5_2024-04-07T11-58-42.351838.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T11-58-42.351838.parquet'
- config_name: results
data_files:
- split: 2024_04_07T11_58_42.351838
path:
- results_2024-04-07T11-58-42.351838.parquet
- split: latest
path:
- results_2024-04-07T11-58-42.351838.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/Calme-12B-Instruct-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/Calme-12B-Instruct-v0.1](https://huggingface.co/MaziyarPanahi/Calme-12B-Instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Calme-12B-Instruct-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-07T11:58:42.351838](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Calme-12B-Instruct-v0.1/blob/main/results_2024-04-07T11-58-42.351838.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6395721812467401,
"acc_stderr": 0.032545465665076455,
"acc_norm": 0.6416350186157972,
"acc_norm_stderr": 0.03321258498717781,
"mc1": 0.5593635250917993,
"mc1_stderr": 0.017379697555437442,
"mc2": 0.7170955842280968,
"mc2_stderr": 0.014876105209431856
},
"harness|arc:challenge|25": {
"acc": 0.6689419795221843,
"acc_stderr": 0.01375206241981783,
"acc_norm": 0.6885665529010239,
"acc_norm_stderr": 0.013532472099850945
},
"harness|hellaswag|10": {
"acc": 0.7143995220075682,
"acc_stderr": 0.004507768029590103,
"acc_norm": 0.8856801433977295,
"acc_norm_stderr": 0.0031754904136944186
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901409,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901409
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.0286372356398009,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.0286372356398009
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.041657747757287644,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.041657747757287644
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.025542846817400513,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.025542846817400513
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313033,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313033
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.034063153607115086,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.034063153607115086
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7219730941704036,
"acc_stderr": 0.030069584874494043,
"acc_norm": 0.7219730941704036,
"acc_norm_stderr": 0.030069584874494043
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.0384985609879409,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.0384985609879409
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424384,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424384
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.025624723994030454,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.025624723994030454
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4111731843575419,
"acc_stderr": 0.016456498033977515,
"acc_norm": 0.4111731843575419,
"acc_norm_stderr": 0.016456498033977515
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046637,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046637
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4810951760104302,
"acc_stderr": 0.012761104871472657,
"acc_norm": 0.4810951760104302,
"acc_norm_stderr": 0.012761104871472657
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.01911721391149515,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.01911721391149515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252089,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252089
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5593635250917993,
"mc1_stderr": 0.017379697555437442,
"mc2": 0.7170955842280968,
"mc2_stderr": 0.014876105209431856
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222811
},
"harness|gsm8k|5": {
"acc": 0.5125094768764216,
"acc_stderr": 0.01376817361508785
}
}
```
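The keys in the results above follow a `harness|<task>|<n_shot>` naming convention. A small helper to parse one (hypothetical, not part of the leaderboard tooling, but consistent with the keys shown above):

```python
def parse_eval_key(key: str) -> tuple[str, int]:
    # e.g. "harness|hendrycksTest-virology|5" -> ("hendrycksTest-virology", 5)
    _, task, n_shot = key.split("|")
    return task, int(n_shot)

print(parse_eval_key("harness|winogrande|5"))     # ('winogrande', 5)
print(parse_eval_key("harness|truthfulqa:mc|0"))  # ('truthfulqa:mc', 0)
```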
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
wisenut-nlp-team/Open-LLM-Benchmark | ---
configs:
- config_name: ARC
data_files:
- split: test
path: data/Ko_ARC_instruction.csv
- config_name: HellaSwag
data_files:
- split: test
path: data/Ko_HellaSwag_instruction.csv
- config_name: MMLU
data_files:
- split: test
path: data/Ko_MMLU_instruction.csv
- config_name: TruthfulQA
data_files:
- split: test
path: data/Ko_TruthfulQA_instruction.csv
- config_name: Grammar
data_files:
- split: test
path: data/Ko_Grammar_instruction.csv
- config_name: HateSpeech
data_files:
- split: test
path: data/Ko_HateSpeech_instruction.csv
- config_name: GeneralKnowledge
data_files:
- split: test
path: data/Ko_GeneralKnowledge_instruction.csv
- config_name: SentenceGen
data_files:
- split: test
path: data/Ko_SentenceGen_instruction.csv
---
### Dataset Statistics
| Category | # Questions |
|------------------------------|-------------|
| **ARC** | 2,590 |
| **HellaSwag** | 3,029 |
| **MMLU** | 4,329 |
| **TruthfulQA** | 1,634 |
| **Kor-CommonGEN** | |
| Grammar | 2,950 |
| HateSpeech | 2,251 |
| GeneralKnowledge | 4,900 |
| SentenceGen | 2,500 |
| **Total** | **24,183** | |
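As a quick sanity check, the per-category counts above sum to the stated total:

```python
# Question counts per category, as listed in the statistics table
counts = {
    "ARC": 2590,
    "HellaSwag": 3029,
    "MMLU": 4329,
    "TruthfulQA": 1634,
    "Grammar": 2950,
    "HateSpeech": 2251,
    "GeneralKnowledge": 4900,
    "SentenceGen": 2500,
}
print(sum(counts.values()))  # 24183
```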
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/1b6266d7 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1341
dataset_size: 178
---
# Dataset Card for "1b6266d7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/global_street_style_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 190230
num_examples: 1000
download_size: 24159
dataset_size: 190230
---
# Dataset Card for "global_street_style_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RIPS-Goog-23/FUNSD | ---
dataset_info:
features:
- name: pixel_values
dtype:
array3_d:
shape:
- 3
- 224
- 224
dtype: float32
- name: input_ids
sequence: int64
- name: attention_mask
sequence: int64
- name: bbox
dtype:
array2_d:
shape:
- 512
- 4
dtype: int64
- name: labels
sequence: int64
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-HEADER
'2': I-HEADER
'3': B-QUESTION
'4': I-QUESTION
'5': B-ANSWER
'6': I-ANSWER
splits:
- name: test
num_bytes: 31847456
num_examples: 50
- name: train
num_bytes: 94872948
num_examples: 149
download_size: 5136310
dataset_size: 126720404
---
# Dataset Card for "FUNSD"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
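The `ner_tags` feature above is a class label with seven names; a minimal id-to-label mapping, assuming the ordering declared in the YAML metadata:

```python
# Label names in the order given by the ner_tags class_label declaration
FUNSD_NER_LABELS = [
    "O",
    "B-HEADER", "I-HEADER",
    "B-QUESTION", "I-QUESTION",
    "B-ANSWER", "I-ANSWER",
]

def id2label(i: int) -> str:
    return FUNSD_NER_LABELS[i]

print(id2label(3))  # B-QUESTION
```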
carlosejimenez/seq2seq-cola | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
- name: orig_idx
dtype: int64
splits:
- name: train
num_bytes: 703700
num_examples: 8551
- name: validation
num_bytes: 87041
num_examples: 1043
- name: test
num_bytes: 86025
num_examples: 1063
download_size: 0
dataset_size: 876766
---
# Dataset Card for "seq2seq-cola"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
syedjameerbabu/mini-platypus-instruction-dataset-for-finetuning | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245921
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cookinai/kugelblitz-alpha-v0.1 | ---
license: cc-by-nc-4.0
---
# Kugelblitz Alpha
# Experimental: my first dataset, so it may have bugs
A combination of 6 high-quality Hugging Face datasets.
I had my eye on these datasets for a while and thought they were interesting, so I decided to combine them into one.
- [OpenHermes](https://huggingface.co/datasets/teknium/openhermes) Very Performant Dataset, Well Known
- [Jon Durbin's Bagel](https://huggingface.co/datasets/jondurbin/bagel-v0.3) Merge of Plenty of Datasets
- [Hercules-3.0](https://huggingface.co/datasets/Locutusque/Hercules-v3.0) Another Dataset Merge, Scores High on Leaderboards
- [Cosmopedia-100K](https://huggingface.co/datasets/HuggingFaceTB/cosmopedia-100k) 100K Sampling of a Mixtral Synthetic Dataset
- [Slimorca](https://huggingface.co/datasets/Open-Orca/SlimOrca-Dedup) Good Quality Dataset
- [Samantha](https://huggingface.co/datasets/digitalpipelines/samantha-1.1-uncensored) Small Sampling of The Samantha Dataset, For a Less Robotic Model
 |
tastypear/unalignment-toxic-dpo-v0.2-zh_cn | ---
license: cc-by-4.0
tags:
- not-for-all-audiences
language:
- zh
task_categories:
- conversational
---
A Chinese-English parallel version of the dataset unalignment/toxic-dpo-v0.2.
This is a highly harmful dataset intended to illustrate, with very few examples, how DPO can be used to easily de-censor/unalign a model.
The Chinese text in this parallel version comes from free translations by several different models. During conversion, the models were allowed to paraphrase for fluency, so no guarantee can be made about the accuracy of the results.
For usage restrictions, please refer to the Usage restriction section of the original dataset.
---
# Original Dataset Description:
## Toxic-DPO
This is a highly toxic, "harmful" dataset meant to illustrate how direct preference optimization (DPO) can be used to de-censor/unalign a model quite easily using very few examples.
Many of the examples still contain some amount of warnings/disclaimers, so it's still somewhat editorialized.
## Usage restriction
To use this data, you must acknowledge/agree to the following:
- data contained within is "toxic"/"harmful", and contains profanity and other types of sensitive content
- none of the content or views contained in the dataset necessarily align with my personal beliefs or opinions, they are simply text generated by LLMs automatically
- you are able to use the dataset lawfully, particularly in locations with less-than-free speech laws
- you, and you alone are responsible for having downloaded and used the dataset, and I am completely indemnified from any and all liabilities
This dataset is meant __*exclusively*__ for academic/research or other non-nefarious use-cases. |
open-llm-leaderboard/details_rombodawg__Everyone-Coder-4x7b-Base | ---
pretty_name: Evaluation run of rombodawg/Everyone-Coder-4x7b-Base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rombodawg/Everyone-Coder-4x7b-Base](https://huggingface.co/rombodawg/Everyone-Coder-4x7b-Base)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rombodawg__Everyone-Coder-4x7b-Base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-15T17:47:56.627468](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__Everyone-Coder-4x7b-Base/blob/main/results_2024-01-15T17-47-56.627468.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6447898132540958,\n\
\ \"acc_stderr\": 0.031915985387073305,\n \"acc_norm\": 0.6461876134084575,\n\
\ \"acc_norm_stderr\": 0.03255592718009434,\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.016571797910626615,\n \"mc2\": 0.49160643723765735,\n\
\ \"mc2_stderr\": 0.015188709391608397\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6117747440273038,\n \"acc_stderr\": 0.014241614207414046,\n\
\ \"acc_norm\": 0.6450511945392492,\n \"acc_norm_stderr\": 0.013983036904094087\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6623182632941645,\n\
\ \"acc_stderr\": 0.004719529099913131,\n \"acc_norm\": 0.8481378211511651,\n\
\ \"acc_norm_stderr\": 0.0035815378475817965\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.0436031486007746,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.0436031486007746\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n\
\ \"acc_stderr\": 0.024362599693031096,\n \"acc_norm\": 0.7580645161290323,\n\
\ \"acc_norm_stderr\": 0.024362599693031096\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.031584153240477114,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.031584153240477114\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062153,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062153\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465076,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465076\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.01646534546739154,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.01646534546739154\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057222,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057222\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281382\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069356,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069356\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n\
\ \"acc_stderr\": 0.015748421208187303,\n \"acc_norm\": 0.3318435754189944,\n\
\ \"acc_norm_stderr\": 0.015748421208187303\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959603,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959603\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45827900912646674,\n\
\ \"acc_stderr\": 0.01272570165695364,\n \"acc_norm\": 0.45827900912646674,\n\
\ \"acc_norm_stderr\": 0.01272570165695364\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174923,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174923\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466136,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466136\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.016571797910626615,\n \"mc2\": 0.49160643723765735,\n\
\ \"mc2_stderr\": 0.015188709391608397\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987729\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6345716451857468,\n \
\ \"acc_stderr\": 0.013264282030266635\n }\n}\n```"
repo_url: https://huggingface.co/rombodawg/Everyone-Coder-4x7b-Base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|arc:challenge|25_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|arc:challenge|25_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|gsm8k|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|gsm8k|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hellaswag|10_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hellaswag|10_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T02-37-27.677232.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T17-47-56.627468.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T17-47-56.627468.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- '**/details_harness|winogrande|5_2024-01-15T02-37-27.677232.parquet'
- split: 2024_01_15T17_47_56.627468
path:
- '**/details_harness|winogrande|5_2024-01-15T17-47-56.627468.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-15T17-47-56.627468.parquet'
- config_name: results
data_files:
- split: 2024_01_15T02_37_27.677232
path:
- results_2024-01-15T02-37-27.677232.parquet
- split: 2024_01_15T17_47_56.627468
path:
- results_2024-01-15T17-47-56.627468.parquet
- split: latest
path:
- results_2024-01-15T17-47-56.627468.parquet
---
# Dataset Card for Evaluation run of rombodawg/Everyone-Coder-4x7b-Base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rombodawg/Everyone-Coder-4x7b-Base](https://huggingface.co/rombodawg/Everyone-Coder-4x7b-Base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rombodawg__Everyone-Coder-4x7b-Base",
"harness_winogrande_5",
	split="latest")
```
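The timestamped split names (e.g. `2024_01_15T17_47_56.627468`) encode the run time, so the split the "latest" alias points to is simply the most recent one. A minimal sketch of that resolution, assuming only the naming convention shown in this card (the helper `resolve_latest_split` is hypothetical, not part of the `datasets` API):

```python
from datetime import datetime

def resolve_latest_split(split_names):
    """Return the most recent timestamped split (what the 'latest' alias points to)."""
    # Split names look like '2024_01_15T17_47_56.627468'; parse and take the max.
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

print(resolve_latest_split(
    ["2024_01_15T02_37_27.677232", "2024_01_15T17_47_56.627468", "latest"]
))
# -> 2024_01_15T17_47_56.627468
```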
## Latest results
These are the [latest results from run 2024-01-15T17:47:56.627468](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__Everyone-Coder-4x7b-Base/blob/main/results_2024-01-15T17-47-56.627468.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6447898132540958,
"acc_stderr": 0.031915985387073305,
"acc_norm": 0.6461876134084575,
"acc_norm_stderr": 0.03255592718009434,
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626615,
"mc2": 0.49160643723765735,
"mc2_stderr": 0.015188709391608397
},
"harness|arc:challenge|25": {
"acc": 0.6117747440273038,
"acc_stderr": 0.014241614207414046,
"acc_norm": 0.6450511945392492,
"acc_norm_stderr": 0.013983036904094087
},
"harness|hellaswag|10": {
"acc": 0.6623182632941645,
"acc_stderr": 0.004719529099913131,
"acc_norm": 0.8481378211511651,
"acc_norm_stderr": 0.0035815378475817965
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778408,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778408
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0436031486007746,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0436031486007746
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031096,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031096
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.031584153240477114,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.031584153240477114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062153,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062153
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465076,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465076
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.01646534546739154,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.01646534546739154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057222,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057222
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097652,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097652
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069356,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069356
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187303,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187303
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959603,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959603
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45827900912646674,
"acc_stderr": 0.01272570165695364,
"acc_norm": 0.45827900912646674,
"acc_norm_stderr": 0.01272570165695364
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174923,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174923
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466136,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466136
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626615,
"mc2": 0.49160643723765735,
"mc2_stderr": 0.015188709391608397
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987729
},
"harness|gsm8k|5": {
"acc": 0.6345716451857468,
"acc_stderr": 0.013264282030266635
}
}
```
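For a quick sanity check of a results blob like the one above, the per-task `acc` values can be averaged with a few lines of Python. This is an illustrative sketch (the `harness|hendrycksTest-` key prefix follows the harness output shown here; `mean_accuracy` and `toy` are names introduced for the example):

```python
def mean_accuracy(results: dict, prefix: str = "harness|hendrycksTest-") -> float:
    """Average the `acc` field over every task whose key starts with `prefix`."""
    accs = [v["acc"] for k, v in results.items() if k.startswith(prefix)]
    if not accs:
        raise ValueError(f"no tasks match prefix {prefix!r}")
    return sum(accs) / len(accs)


# Two of the MMLU sub-task entries from the JSON above; the gsm8k
# entry is excluded by the prefix filter.
toy = {
    "harness|hendrycksTest-management|5": {"acc": 0.8349514563106796},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8760683760683761},
    "harness|gsm8k|5": {"acc": 0.6345716451857468},
}
print(round(mean_accuracy(toy), 4))  # 0.8555
```

The same helper works for any task family by changing the prefix, e.g. `prefix="harness|gsm8k"`.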
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/yukino_yukinoshita_yahariorenoseishunlovecomewamachigatteiru | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Yukino Yukinoshita (Yahari Ore no Seishun LoveCome wa Machigatte Iru)
This is the dataset of Yukino Yukinoshita (Yahari Ore no Seishun LoveCome wa Machigatte Iru), containing 998 images and their tags.
The core tags of this character are `black_hair, long_hair, ribbon, blue_eyes, hair_ribbon, red_ribbon, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 998 | 494.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yukino_yukinoshita_yahariorenoseishunlovecomewamachigatteiru/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 998 | 421.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yukino_yukinoshita_yahariorenoseishunlovecomewamachigatteiru/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 2124 | 837.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yukino_yukinoshita_yahariorenoseishunlovecomewamachigatteiru/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 998 | 494.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yukino_yukinoshita_yahariorenoseishunlovecomewamachigatteiru/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 2124 | 943.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yukino_yukinoshita_yahariorenoseishunlovecomewamachigatteiru/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/yukino_yukinoshita_yahariorenoseishunlovecomewamachigatteiru',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
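The non-raw packages in the table above ship as flat `IMG+TXT` archives: each image is accompanied by a same-named `.txt` file holding its comma-separated tags. If you would rather not depend on waifuc, the pairs can be walked manually after extraction; a minimal sketch (`pair_images_with_tags` is a helper name made up for this example):

```python
import os

def pair_images_with_tags(dataset_dir, exts=(".jpg", ".jpeg", ".png", ".webp")):
    """Yield (image_path, tag_string) pairs from an extracted IMG+TXT archive."""
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in exts:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if not os.path.exists(txt_path):
            continue  # skip images without a sidecar tag file
        with open(txt_path, encoding="utf-8") as f:
            yield os.path.join(dataset_dir, name), f.read().strip()
```

Point it at the directory you extracted one of the `IMG+TXT` zips into to iterate over (image path, tags) pairs.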
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, collared_shirt, neck_ribbon, sobu_high_school_uniform, solo, upper_body, white_shirt, hair_between_eyes, looking_at_viewer, indoors, sunset, window |
| 1 | 8 |  |  |  |  |  | 1girl, black_jacket, blazer, collared_shirt, neck_ribbon, sobu_high_school_uniform, solo, white_shirt, looking_at_viewer, upper_body, hair_between_eyes |
| 2 | 8 |  |  |  |  |  | 1girl, black_jacket, blazer, neck_ribbon, sobu_high_school_uniform, solo, upper_body, white_shirt, ahoge, collared_shirt, closed_mouth, hair_between_eyes |
| 3 | 6 |  |  |  |  |  | 1girl, black_jacket, blazer, looking_at_viewer, sobu_high_school_uniform, solo |
| 4 | 10 |  |  |  |  |  | 1girl, black_jacket, blazer, open_mouth, sobu_high_school_uniform, solo, looking_at_viewer |
| 5 | 11 |  |  |  |  |  | 1girl, black_jacket, blazer, sobu_high_school_uniform, solo |
| 6 | 12 |  |  |  |  |  | 1girl, black_jacket, blazer, closed_eyes, sobu_high_school_uniform, solo |
| 7 | 5 |  |  |  |  |  | 1girl, black_jacket, blazer, profile, sobu_high_school_uniform, solo, sunset |
| 8 | 5 |  |  |  |  |  | 1girl, black_jacket, blazer, bookshelf, sobu_high_school_uniform, solo, glasses |
| 9 | 5 |  |  |  |  |  | 1girl, black_jacket, blazer, shirt, sobu_high_school_uniform, solo, upper_body |
| 10 | 6 |  |  |  |  |  | 1girl, ahoge, black_jacket, blazer, closed_eyes, sobu_high_school_uniform, solo, teacup |
| 11 | 10 |  |  |  |  |  | 1girl, black_jacket, black_thighhighs, blazer, skirt, sobu_high_school_uniform, solo, zettai_ryouiki, ahoge |
| 12 | 35 |  |  |  |  |  | 1girl, sobu_high_school_uniform, plaid_skirt, pleated_skirt, black_jacket, blazer, solo, white_shirt, neck_ribbon, long_sleeves, collared_shirt, black_thighhighs, zettai_ryouiki |
| 13 | 16 |  |  |  |  |  | 1girl, solo, sobu_high_school_uniform, shirt, apron |
| 14 | 8 |  |  |  |  |  | 1girl, sobu_high_school_uniform, solo, looking_at_viewer, shirt, ahoge, sweater_vest, crossed_arms |
| 15 | 11 |  |  |  |  |  | 1girl, hair_between_eyes, portrait, solo, close-up, parody, anime_coloring, open_mouth, closed_mouth, looking_at_viewer |
| 16 | 5 |  |  |  |  |  | reading, sitting, sobu_high_school_uniform, sweater_vest, 1girl, plaid_skirt, solo, holding_book, shirt, thighhighs, window, zettai_ryouiki |
| 17 | 5 |  |  |  |  |  | 1girl, blue_necktie, formal, solo, suit, upper_body, white_shirt, black_jacket, looking_at_viewer, ponytail, smile, hair_between_eyes, sidelocks, ahoge |
| 18 | 5 |  |  |  |  |  | 1girl, ahoge, black_pants, blue_necktie, formal, ponytail, sidelocks, suit, vest, white_gloves, white_shirt, solo, standing, black_jacket, hair_between_eyes, butler, closed_mouth, looking_at_viewer |
| 19 | 8 |  |  |  |  |  | 1girl, necklace, shirt, solo, upper_body, tree |
| 20 | 7 |  |  |  |  |  | 1girl, blue_ribbon, neck_ribbon, outdoors, white_shirt, collared_shirt, disposable_cup, drinking_straw, hair_between_eyes, upper_body, blue_cardigan, holding_cup, low_twintails, ahoge, bubble_tea, day, looking_at_viewer, open_mouth, school_uniform, sky, solo, blurry, blush, cloud, teeth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | collared_shirt | neck_ribbon | sobu_high_school_uniform | solo | upper_body | white_shirt | hair_between_eyes | looking_at_viewer | indoors | sunset | window | black_jacket | blazer | ahoge | closed_mouth | open_mouth | closed_eyes | profile | bookshelf | glasses | shirt | teacup | black_thighhighs | skirt | zettai_ryouiki | plaid_skirt | pleated_skirt | long_sleeves | apron | sweater_vest | crossed_arms | portrait | close-up | parody | anime_coloring | reading | sitting | holding_book | thighhighs | blue_necktie | formal | suit | ponytail | smile | sidelocks | black_pants | vest | white_gloves | standing | butler | necklace | tree | blue_ribbon | outdoors | disposable_cup | drinking_straw | blue_cardigan | holding_cup | low_twintails | bubble_tea | day | school_uniform | sky | blurry | blush | cloud | teeth |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------------|:--------------|:---------------------------|:-------|:-------------|:--------------|:--------------------|:--------------------|:----------|:---------|:---------|:---------------|:---------|:--------|:---------------|:-------------|:--------------|:----------|:------------|:----------|:--------|:---------|:-------------------|:--------|:-----------------|:--------------|:----------------|:---------------|:--------|:---------------|:---------------|:-----------|:-----------|:---------|:-----------------|:----------|:----------|:---------------|:-------------|:---------------|:---------|:-------|:-----------|:--------|:------------|:--------------|:-------|:---------------|:-----------|:---------|:-----------|:-------|:--------------|:-----------|:-----------------|:-----------------|:----------------|:--------------|:----------------|:-------------|:------|:-----------------|:------|:---------|:--------|:--------|:--------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | X | X | | | | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | | | X | X | | | | X | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 11 |  |  |  |  |  | X | | | X | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 12 |  |  |  |  |  | X | | | X | X | | | | | | | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | | X | X | | | | | | X | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | X | | | X | X | | | | | | | | X | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | | X | X | X | | | | | | | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 6 |  |  |  |  |  | X | | | X | X | | | | | | | | X | X | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 11 | 10 |  |  |  |  |  | X | | | X | X | | | | | | | | X | X | X | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 12 | 35 |  |  |  |  |  | X | X | X | X | X | | X | | | | | | X | X | | | | | | | | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 13 | 16 |  |  |  |  |  | X | | | X | X | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 14 | 8 |  |  |  |  |  | X | | | X | X | | | | X | | | | | | X | | | | | | | X | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 15 | 11 |  |  |  |  |  | X | | | | X | | | X | X | | | | | | | X | X | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 16 | 5 |  |  |  |  |  | X | | | X | X | | | | | | | X | | | | | | | | | | X | | | | X | X | | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 17 | 5 |  |  |  |  |  | X | | | | X | X | X | X | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 18 | 5 |  |  |  |  |  | X | | | | X | | X | X | X | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 19 | 8 |  |  |  |  |  | X | | | | X | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | |
| 20 | 7 |  |  |  |  |  | X | X | X | | X | X | X | X | X | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
MikeXydas/ToTTo | ---
license: mit
---
Original repo of dataset: https://github.com/google-research-datasets/ToTTo |
mmajbaig/gpt2-124M-qlora-chat-support | ---
dataset_info:
  features:
  - name: answer
    dtype: string
  - name: question
    dtype: string
  splits:
  - name: train
    num_bytes: 17924
    num_examples: 79
  download_size: 9896
  dataset_size: 17924
---
# Dataset Card for "gpt2-124M-qlora-chat-support"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
charanhu/Kannada-Dataset-v02 | ---
dataset_info:
  features:
  - name: original_instruction
    dtype: string
  - name: original_output
    dtype: string
  - name: translated_instruction
    dtype: string
  - name: translated_output
    dtype: string
  splits:
  - name: train
    num_bytes: 506733958
    num_examples: 389608
  download_size: 232649343
  dataset_size: 506733958
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
|
TuringsSolutions/PFAF3 | ---
license: mit
---
|
wisdomik/Quilt_VQA | ---
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
dataset_info:
  features:
  - name: image
    dtype: image
  - name: question
    dtype: string
  - name: answer
    dtype: string
  - name: answer_type
    dtype: string
  - name: context
    dtype: string
  splits:
  - name: train
    num_bytes: 225575327
    num_examples: 985
  download_size: 222944861
  dataset_size: 225575327
extra_gated_prompt: >-
  Please read and agree to the following terms: 1. The requester details
  provided are not faked. 2. The resource will not be used for
  commercial/clinical purposes and will be used for scientific research only. 3.
  The data will not be re-distributed, published, copied, or further
  disseminated in any way or form whatsoever, whether for profit or not. 4. The
  right study/paper (Quilt-1M(https://quilt1m.github.io/) and Quilt-LLaVa
  (https://quilt-llava.github.io) papers) will be cited in any publication(s)
  that uses this model/data
extra_gated_fields:
  Email: text
  First and last name: text
  Affiliation: text
  Type of Affiliation:
    type: select
    options:
    - Academia
    - Industry
    - Other
  I want to use this model for:
    type: select
    options:
    - Research
    - Education
    - label: Other
      value: other
  I agree to the aforementioned terms of use: checkbox
license: cc-by-nc-nd-3.0
task_categories:
- question-answering
- visual-question-answering
language:
- en
tags:
- medical
- histopathology
- arxiv:2312.04746
pretty_name: Quilt-VQA
size_categories:
- 1K<n<10K
---
# Dataset Card for "Quilt_VQA"
**Paper: Quilt-LLaVA: Visual Instruction Tuning by Extracting Localized Narratives from Open-Source Histopathology Videos**
**Paper or resources for more information:**
https://quilt-llava.github.io/
<p align="center">
<img src="https://quilt-llava.github.io/static/images/quilt_vqa_samples.png" alt="fig2" width="90%"/>
</p>
**Description and Details**
To evaluate Quilt-LLaVA, alongside public VQA pathology datasets, we also generated Quilt-VQA by extracting a Q&A dataset from the naturally occurring questions and answers given in the videos. With the help of GPT-4 and some handcrafted algorithms, we collected a rich evaluation dataset of 1283 Q&A pairs. The top two rows show image-dependent Q&A pairs and the bottom two rows show general-knowledge Q&A pairs. The original question posed by the narrator of the video is highlighted in yellow.
**Dataset date:**
QUILT-VQA was collected in November 2023.
**License:**
MIT License
**Where to send questions or comments about the model:**
https://github.com/quilt-llava/quilt-llava.github.io/issues
**Primary intended uses:**
The primary use of QUILT-VQA is for benchmarking histopathology large multimodal models and chatbots.
**Primary intended users:**
The dataset is intended as a research resource for research communities. We hope that this dataset will enable researchers to better understand and explore the generative capacity of medical large multimodal models.
**Citation**
```bibtex
@misc{seyfioglu2023quiltllava,
title={Quilt-LLaVA: Visual Instruction Tuning by Extracting Localized Narratives from Open-Source Histopathology Videos},
author={Mehmet Saygin Seyfioglu and Wisdom O. Ikezogwo and Fatemeh Ghezloo and Ranjay Krishna and Linda Shapiro},
year={2023},
eprint={2312.04746},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
```bibtex
@misc{ikezogwo2023quilt1m,
title={Quilt-1M: One Million Image-Text Pairs for Histopathology},
author={Wisdom Oluchi Ikezogwo and Mehmet Saygin Seyfioglu and Fatemeh Ghezloo and Dylan Stefan Chan Geva and Fatwir Sheikh Mohammed and Pavan Kumar Anand and Ranjay Krishna and Linda Shapiro},
year={2023},
eprint={2306.11207},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
[](https://creativecommons.org/licenses/by-nc/3.0/us/deed.en) [-red.svg)](https://en.wikipedia.org/wiki/MIT_License) [](https://en.wikipedia.org/wiki/MIT_License)
**Usage and License Notices**: The data, code, and model checkpoints are intended and licensed for research use only. They are also subject to additional restrictions dictated by the Terms of Use: QUILT-1M, LLaMA, Vicuna and GPT-4 respectively. The model is made available under CC BY NC 3.0 license and the data, code under CC BY NC ND 3.0 with additional Data Use Agreement (DUA). The data, code, and model checkpoints may be used for non-commercial purposes and any models trained using the dataset should be used only for research purposes. It is expressly prohibited for models trained on this data to be used in clinical care or for any clinical decision making purposes.
|
FaalSa/f12 | ---
dataset_info:
  features:
  - name: start
    dtype: timestamp[s]
  - name: target
    sequence: float32
  - name: item_id
    dtype: string
  - name: feat_static_cat
    sequence: uint64
  splits:
  - name: train
    num_bytes: 79711
    num_examples: 1
  - name: validation
    num_bytes: 80191
    num_examples: 1
  - name: test
    num_bytes: 80671
    num_examples: 1
  download_size: 61575
  dataset_size: 240573
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
  - split: test
    path: data/test-*
---
|
open-llm-leaderboard/details_pansophic__new_model_test3 | ---
pretty_name: Evaluation run of pansophic/new_model_test3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pansophic/new_model_test3](https://huggingface.co/pansophic/new_model_test3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pansophic__new_model_test3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T13:42:44.825644](https://huggingface.co/datasets/open-llm-leaderboard/details_pansophic__new_model_test3/blob/main/results_2024-03-01T13-42-44.825644.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.49523259900271716,\n\
\ \"acc_stderr\": 0.034424823333227196,\n \"acc_norm\": 0.49686305906710915,\n\
\ \"acc_norm_stderr\": 0.035130206377194745,\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.46890502869040307,\n\
\ \"mc2_stderr\": 0.01558804842335235\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4786689419795222,\n \"acc_stderr\": 0.014598087973127106,\n\
\ \"acc_norm\": 0.5179180887372014,\n \"acc_norm_stderr\": 0.014602005585490978\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5925114519020116,\n\
\ \"acc_stderr\": 0.0049036288872645354,\n \"acc_norm\": 0.7860983867755427,\n\
\ \"acc_norm_stderr\": 0.00409220139389831\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.4,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.030656748696739435,\n\
\ \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.030656748696739435\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4791666666666667,\n\
\ \"acc_stderr\": 0.04177578950739994,\n \"acc_norm\": 0.4791666666666667,\n\
\ \"acc_norm_stderr\": 0.04177578950739994\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\
\ \"acc_stderr\": 0.03809342081273958,\n \"acc_norm\": 0.4797687861271676,\n\
\ \"acc_norm_stderr\": 0.03809342081273958\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.031907012423268113,\n\
\ \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.031907012423268113\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31216931216931215,\n \"acc_stderr\": 0.023865206836972592,\n \"\
acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.023865206836972592\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5387096774193548,\n \"acc_stderr\": 0.02835863485983693,\n \"\
acc_norm\": 0.5387096774193548,\n \"acc_norm_stderr\": 0.02835863485983693\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n \"\
acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6121212121212121,\n \"acc_stderr\": 0.038049136539710114,\n\
\ \"acc_norm\": 0.6121212121212121,\n \"acc_norm_stderr\": 0.038049136539710114\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.601010101010101,\n \"acc_stderr\": 0.03488901616852732,\n \"acc_norm\"\
: 0.601010101010101,\n \"acc_norm_stderr\": 0.03488901616852732\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7098445595854922,\n \"acc_stderr\": 0.03275264467791516,\n\
\ \"acc_norm\": 0.7098445595854922,\n \"acc_norm_stderr\": 0.03275264467791516\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4358974358974359,\n \"acc_stderr\": 0.025141801511177498,\n\
\ \"acc_norm\": 0.4358974358974359,\n \"acc_norm_stderr\": 0.025141801511177498\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507382,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507382\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.44537815126050423,\n \"acc_stderr\": 0.0322841062671639,\n \
\ \"acc_norm\": 0.44537815126050423,\n \"acc_norm_stderr\": 0.0322841062671639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7119266055045872,\n \"acc_stderr\": 0.019416445892636025,\n \"\
acc_norm\": 0.7119266055045872,\n \"acc_norm_stderr\": 0.019416445892636025\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6176470588235294,\n \"acc_stderr\": 0.03410785338904719,\n \"\
acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.03410785338904719\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.030685820596610795,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.030685820596610795\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n\
\ \"acc_stderr\": 0.03340867501923324,\n \"acc_norm\": 0.547085201793722,\n\
\ \"acc_norm_stderr\": 0.03340867501923324\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4961832061068702,\n \"acc_stderr\": 0.043851623256015534,\n\
\ \"acc_norm\": 0.4961832061068702,\n \"acc_norm_stderr\": 0.043851623256015534\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.03919415545048409,\n\
\ \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.03919415545048409\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7264957264957265,\n\
\ \"acc_stderr\": 0.02920254015343117,\n \"acc_norm\": 0.7264957264957265,\n\
\ \"acc_norm_stderr\": 0.02920254015343117\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6756066411238825,\n\
\ \"acc_stderr\": 0.01674092904716269,\n \"acc_norm\": 0.6756066411238825,\n\
\ \"acc_norm_stderr\": 0.01674092904716269\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.02688264343402289,\n\
\ \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.02688264343402289\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n\
\ \"acc_stderr\": 0.015461169002371556,\n \"acc_norm\": 0.3094972067039106,\n\
\ \"acc_norm_stderr\": 0.015461169002371556\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.545751633986928,\n \"acc_stderr\": 0.02850980780262659,\n\
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.02850980780262659\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5305466237942122,\n\
\ \"acc_stderr\": 0.028345045864840625,\n \"acc_norm\": 0.5305466237942122,\n\
\ \"acc_norm_stderr\": 0.028345045864840625\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.027815973433878014,\n\
\ \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.027815973433878014\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36524822695035464,\n \"acc_stderr\": 0.02872386385328128,\n \
\ \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.02872386385328128\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36766623207301175,\n\
\ \"acc_stderr\": 0.012314845910071698,\n \"acc_norm\": 0.36766623207301175,\n\
\ \"acc_norm_stderr\": 0.012314845910071698\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4338235294117647,\n \"acc_stderr\": 0.030105636570016636,\n\
\ \"acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.030105636570016636\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4673202614379085,\n \"acc_stderr\": 0.020184583359102202,\n \
\ \"acc_norm\": 0.4673202614379085,\n \"acc_norm_stderr\": 0.020184583359102202\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4816326530612245,\n \"acc_stderr\": 0.03198761546763126,\n\
\ \"acc_norm\": 0.4816326530612245,\n \"acc_norm_stderr\": 0.03198761546763126\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6766169154228856,\n\
\ \"acc_stderr\": 0.03307615947979035,\n \"acc_norm\": 0.6766169154228856,\n\
\ \"acc_norm_stderr\": 0.03307615947979035\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.46890502869040307,\n\
\ \"mc2_stderr\": 0.01558804842335235\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7048145224940805,\n \"acc_stderr\": 0.012819410741754765\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.422289613343442,\n \
\ \"acc_stderr\": 0.013605126449611878\n }\n}\n```"
repo_url: https://huggingface.co/pansophic/new_model_test3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|arc:challenge|25_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|gsm8k|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hellaswag|10_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T13-42-44.825644.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T13-42-44.825644.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- '**/details_harness|winogrande|5_2024-03-01T13-42-44.825644.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T13-42-44.825644.parquet'
- config_name: results
data_files:
- split: 2024_03_01T13_42_44.825644
path:
- results_2024-03-01T13-42-44.825644.parquet
- split: latest
path:
- results_2024-03-01T13-42-44.825644.parquet
---
# Dataset Card for Evaluation run of pansophic/new_model_test3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [pansophic/new_model_test3](https://huggingface.co/pansophic/new_model_test3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pansophic__new_model_test3",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-01T13:42:44.825644](https://huggingface.co/datasets/open-llm-leaderboard/details_pansophic__new_model_test3/blob/main/results_2024-03-01T13-42-44.825644.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.49523259900271716,
"acc_stderr": 0.034424823333227196,
"acc_norm": 0.49686305906710915,
"acc_norm_stderr": 0.035130206377194745,
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.46890502869040307,
"mc2_stderr": 0.01558804842335235
},
"harness|arc:challenge|25": {
"acc": 0.4786689419795222,
"acc_stderr": 0.014598087973127106,
"acc_norm": 0.5179180887372014,
"acc_norm_stderr": 0.014602005585490978
},
"harness|hellaswag|10": {
"acc": 0.5925114519020116,
"acc_stderr": 0.0049036288872645354,
"acc_norm": 0.7860983867755427,
"acc_norm_stderr": 0.00409220139389831
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.030656748696739435,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.030656748696739435
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4791666666666667,
"acc_stderr": 0.04177578950739994,
"acc_norm": 0.4791666666666667,
"acc_norm_stderr": 0.04177578950739994
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273958,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273958
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.031907012423268113,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.031907012423268113
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31216931216931215,
"acc_stderr": 0.023865206836972592,
"acc_norm": 0.31216931216931215,
"acc_norm_stderr": 0.023865206836972592
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5387096774193548,
"acc_stderr": 0.02835863485983693,
"acc_norm": 0.5387096774193548,
"acc_norm_stderr": 0.02835863485983693
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6121212121212121,
"acc_stderr": 0.038049136539710114,
"acc_norm": 0.6121212121212121,
"acc_norm_stderr": 0.038049136539710114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.601010101010101,
"acc_stderr": 0.03488901616852732,
"acc_norm": 0.601010101010101,
"acc_norm_stderr": 0.03488901616852732
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7098445595854922,
"acc_stderr": 0.03275264467791516,
"acc_norm": 0.7098445595854922,
"acc_norm_stderr": 0.03275264467791516
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4358974358974359,
"acc_stderr": 0.025141801511177498,
"acc_norm": 0.4358974358974359,
"acc_norm_stderr": 0.025141801511177498
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507382,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507382
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.44537815126050423,
"acc_stderr": 0.0322841062671639,
"acc_norm": 0.44537815126050423,
"acc_norm_stderr": 0.0322841062671639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7119266055045872,
"acc_stderr": 0.019416445892636025,
"acc_norm": 0.7119266055045872,
"acc_norm_stderr": 0.019416445892636025
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.03410785338904719,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.03410785338904719
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.030685820596610795,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.030685820596610795
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.547085201793722,
"acc_stderr": 0.03340867501923324,
"acc_norm": 0.547085201793722,
"acc_norm_stderr": 0.03340867501923324
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4961832061068702,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.4961832061068702,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5337423312883436,
"acc_stderr": 0.03919415545048409,
"acc_norm": 0.5337423312883436,
"acc_norm_stderr": 0.03919415545048409
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7264957264957265,
"acc_stderr": 0.02920254015343117,
"acc_norm": 0.7264957264957265,
"acc_norm_stderr": 0.02920254015343117
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6756066411238825,
"acc_stderr": 0.01674092904716269,
"acc_norm": 0.6756066411238825,
"acc_norm_stderr": 0.01674092904716269
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.02688264343402289,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.02688264343402289
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3094972067039106,
"acc_stderr": 0.015461169002371556,
"acc_norm": 0.3094972067039106,
"acc_norm_stderr": 0.015461169002371556
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.02850980780262659,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.02850980780262659
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5305466237942122,
"acc_stderr": 0.028345045864840625,
"acc_norm": 0.5305466237942122,
"acc_norm_stderr": 0.028345045864840625
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.02872386385328128,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.02872386385328128
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36766623207301175,
"acc_stderr": 0.012314845910071698,
"acc_norm": 0.36766623207301175,
"acc_norm_stderr": 0.012314845910071698
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4338235294117647,
"acc_stderr": 0.030105636570016636,
"acc_norm": 0.4338235294117647,
"acc_norm_stderr": 0.030105636570016636
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4673202614379085,
"acc_stderr": 0.020184583359102202,
"acc_norm": 0.4673202614379085,
"acc_norm_stderr": 0.020184583359102202
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4816326530612245,
"acc_stderr": 0.03198761546763126,
"acc_norm": 0.4816326530612245,
"acc_norm_stderr": 0.03198761546763126
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6766169154228856,
"acc_stderr": 0.03307615947979035,
"acc_norm": 0.6766169154228856,
"acc_norm_stderr": 0.03307615947979035
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.46890502869040307,
"mc2_stderr": 0.01558804842335235
},
"harness|winogrande|5": {
"acc": 0.7048145224940805,
"acc_stderr": 0.012819410741754765
},
"harness|gsm8k|5": {
"acc": 0.422289613343442,
"acc_stderr": 0.013605126449611878
}
}
```
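As a sketch of how such a results dict can be turned into a single leaderboard-style score, the snippet below averages one chosen metric per benchmark (acc_norm for ARC and HellaSwag, mc2 for TruthfulQA, acc for Winogrande and GSM8K, following the leaderboard's published methodology). The numbers are copied from the run above; the helper function is illustrative, not part of any official tooling.

```python
def leaderboard_average(results: dict, metric_for_task: dict) -> float:
    """Average one chosen metric per benchmark, leaderboard-style."""
    scores = [results[task][metric] for task, metric in metric_for_task.items()]
    return sum(scores) / len(scores)


# Subset of the results above (MMLU omitted for brevity; on the real
# leaderboard its per-subject accuracies are averaged in as one score).
run = {
    "harness|arc:challenge|25": {"acc_norm": 0.5179180887372014},
    "harness|hellaswag|10": {"acc_norm": 0.7860983867755427},
    "harness|truthfulqa:mc|0": {"mc2": 0.46890502869040307},
    "harness|winogrande|5": {"acc": 0.7048145224940805},
    "harness|gsm8k|5": {"acc": 0.422289613343442},
}
metrics = {
    "harness|arc:challenge|25": "acc_norm",
    "harness|hellaswag|10": "acc_norm",
    "harness|truthfulqa:mc|0": "mc2",
    "harness|winogrande|5": "acc",
    "harness|gsm8k|5": "acc",
}
print(round(leaderboard_average(run, metrics), 4))
```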
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
teganmosi/mental_health_conversation | ---
license: mit
---
|
brainer/drug_info | ---
dataset_info:
- config_name: default
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '196000001'
'1': '196200043'
'2': '196300001'
'3': '196400099'
'4': '196500004'
'5': '197000037'
'6': '197000049'
'7': '197000050'
'8': '197000053'
'9': '197000079'
'10': '197000102'
'11': '197100097'
'12': '197300021'
'13': '197400039'
'14': '197400059'
'15': '197500015'
'16': '197500016'
'17': '197500285'
'18': '197500541'
'19': '197500654'
'20': '197600065'
'21': '197700025'
'22': '197700049'
'23': '197700120'
'24': '197800027'
'25': '197900544'
'26': '197900575'
'27': '198000054'
'28': '198000158'
'29': '198000160'
'30': '198000170'
'31': '198000572'
'32': '198100012'
'33': '198100015'
'34': '198100119'
'35': '198100257'
'36': '198100428'
'37': '198200048'
'38': '198200049'
'39': '198200323'
'40': '198200325'
'41': '198300064'
'42': '198300065'
'43': '198300096'
'44': '198300142'
'45': '198300174'
'46': '198300343'
'47': '198300393'
'48': '198300476'
'49': '198300605'
'50': '198301052'
'51': '198400185'
'52': '198400399'
'53': '198400475'
'54': '198401033'
'55': '198401161'
'56': '198500041'
'57': '198500049'
'58': '198500050'
'59': '198500125'
'60': '198500241'
'61': '198500245'
'62': '198500384'
'63': '198500515'
'64': '198500612'
'65': '198500634'
'66': '198500715'
'67': '198500718'
'68': '198501126'
'69': '198501220'
'70': '198501325'
'71': '198501456'
'72': '198501820'
'73': '198600058'
'74': '198600114'
'75': '198600161'
'76': '198600235'
'77': '198600470'
'78': '198600661'
'79': '198600674'
'80': '198601102'
'81': '198601223'
'82': '198601878'
'83': '198700096'
'84': '198700405'
'85': '198700476'
'86': '198700477'
'87': '198700535'
'88': '198700537'
'89': '198700667'
'90': '198700731'
'91': '198700733'
'92': '198700736'
'93': '198700737'
'94': '198701132'
'95': '198701478'
'96': '198701520'
'97': '198701523'
'98': '198701583'
'99': '198800150'
'100': '198800153'
'101': '198800154'
'102': '198800445'
'103': '198800619'
'104': '198800622'
'105': '198800788'
'106': '198800791'
'107': '198800901'
'108': '198800902'
'109': '198800911'
'110': '198801052'
'111': '198801525'
'112': '198801531'
'113': '198801937'
'114': '198802273'
'115': '198802349'
'116': '198802355'
'117': '198900123'
'118': '198900125'
'119': '198900129'
'120': '198900223'
'121': '198900263'
'122': '198900630'
'123': '198900711'
'124': '198900817'
'125': '198900881'
'126': '198900993'
'127': '198901021'
'128': '198901206'
'129': '198901207'
'130': '198902026'
'131': '198902101'
'132': '198902108'
'133': '198902158'
'134': '198902844'
'135': '199000074'
'136': '199000165'
'137': '199000352'
'138': '199000530'
'139': '199000568'
'140': '199000820'
'141': '199000983'
'142': '199001080'
'143': '199001166'
'144': '199001735'
'145': '199001917'
'146': '199001919'
'147': '199001973'
'148': '199002022'
'149': '199002349'
'150': '199100118'
'151': '199100476'
'152': '199100626'
'153': '199100628'
'154': '199100632'
'155': '199100636'
'156': '199100838'
'157': '199100923'
'158': '199101034'
'159': '199101166'
'160': '199101229'
'161': '199101291'
'162': '199101298'
'163': '199101995'
'164': '199102137'
'165': '199102138'
'166': '199102384'
'167': '199102409'
'168': '199102447'
'169': '199102449'
'170': '199102476'
'171': '199102956'
'172': '199103287'
'173': '199200232'
'174': '199200236'
'175': '199200261'
'176': '199200476'
'177': '199200490'
'178': '199200629'
'179': '199200633'
'180': '199200653'
'181': '199200792'
'182': '199200869'
'183': '199200870'
'184': '199200876'
'185': '199201002'
'186': '199201006'
'187': '199201045'
'188': '199201046'
'189': '199201154'
'190': '199201155'
'191': '199201214'
'192': '199202074'
'193': '199202334'
'194': '199202495'
'195': '199202562'
'196': '199202565'
'197': '199202572'
'198': '199202944'
'199': '199202946'
'200': '199203272'
'201': '199203279'
'202': '199203487'
'203': '199203520'
'204': '199300175'
'205': '199300208'
'206': '199300225'
'207': '199300319'
'208': '199300325'
'209': '199300516'
'210': '199300692'
'211': '199300758'
'212': '199300815'
'213': '199300927'
'214': '199300928'
'215': '199300997'
'216': '199301111'
'217': '199301180'
'218': '199301193'
'219': '199301673'
'220': '199302061'
'221': '199302106'
'222': '199302107'
'223': '199302108'
'224': '199302175'
'225': '199302198'
'226': '199302663'
'227': '199302666'
'228': '199302671'
'229': '199400095'
'230': '199400114'
'231': '199400191'
'232': '199400362'
'233': '199400392'
'234': '199400488'
'235': '199400494'
'236': '199400513'
'237': '199400514'
'238': '199400540'
'239': '199400625'
'240': '199400630'
'241': '199400635'
'242': '199400682'
'243': '199400695'
'244': '199400696'
'245': '199400704'
'246': '199400710'
'247': '199400823'
'248': '199400868'
'249': '199400905'
'250': '199400946'
'251': '199400948'
'252': '199401165'
'253': '199401504'
'254': '199401625'
'255': '199401734'
'256': '199401754'
'257': '199401768'
'258': '199401770'
'259': '199401998'
'260': '199500043'
'261': '199500183'
'262': '199500439'
'263': '199500466'
'264': '199500470'
'265': '199500549'
'266': '199500630'
'267': '199500842'
'268': '199500845'
'269': '199500850'
'270': '199500959'
'271': '199501023'
'272': '199501072'
'273': '199501073'
'274': '199501234'
'275': '199501564'
'276': '199501728'
'277': '199501734'
'278': '199501735'
'279': '199501736'
'280': '199501738'
'281': '199501743'
'282': '199501748'
'283': '199501749'
'284': '199501905'
'285': '199501906'
'286': '199502137'
'287': '199502153'
'288': '199502155'
'289': '199502167'
'290': '199502192'
'291': '199502215'
'292': '199502217'
'293': '199502223'
'294': '199502224'
'295': '199502225'
'296': '199502428'
'297': '199502575'
'298': '199502582'
'299': '199502585'
'300': '199504100'
'301': '199504101'
'302': '199504272'
'303': '199504273'
'304': '199504276'
'305': '199504332'
'306': '199504367'
'307': '199600151'
'308': '199600158'
'309': '199600246'
'310': '199600267'
'311': '199600367'
'312': '199600597'
'313': '199600616'
'314': '199600625'
'315': '199600977'
'316': '199601062'
'317': '199601139'
'318': '199601167'
'319': '199601484'
'320': '199601520'
'321': '199601533'
'322': '199602013'
'323': '199602015'
'324': '199602019'
'325': '199602021'
'326': '199602022'
'327': '199602023'
'328': '199602029'
'329': '199602031'
'330': '199602290'
'331': '199602404'
'332': '199602533'
'333': '199602566'
'334': '199602745'
'335': '199603160'
'336': '199603453'
'337': '199603455'
'338': '199603458'
'339': '199604928'
'340': '199604979'
'341': '199605127'
'342': '199700160'
'343': '199700176'
'344': '199700389'
'345': '199700584'
'346': '199700734'
'347': '199700738'
'348': '199700745'
'349': '199700798'
'350': '199700808'
'351': '199700809'
'352': '199700833'
'353': '199700840'
'354': '199700883'
'355': '199700887'
'356': '199700909'
'357': '199700916'
'358': '199700918'
'359': '199700919'
'360': '199700978'
'361': '199701009'
'362': '199701010'
'363': '199701011'
'364': '199701012'
'365': '199701033'
'366': '199701054'
'367': '199701063'
'368': '199701064'
'369': '199701065'
'370': '199701131'
'371': '199701225'
'372': '199701226'
'373': '199701227'
'374': '199701265'
'375': '199701271'
'376': '199701295'
'377': '199701385'
'378': '199701392'
'379': '199701393'
'380': '199701398'
'381': '199701401'
'382': '199701408'
'383': '199701414'
'384': '199701416'
'385': '199701432'
'386': '199701437'
'387': '199701525'
'388': '199701668'
'389': '199701845'
'390': '199702105'
'391': '199702106'
'392': '199702117'
'393': '199702118'
'394': '199702126'
'395': '199702163'
'396': '199702164'
'397': '199702165'
'398': '199702177'
'399': '199702351'
'400': '199702352'
'401': '199702380'
'402': '199702412'
'403': '199702566'
'404': '199702588'
'405': '199702829'
'406': '199702839'
'407': '199702846'
'408': '199702918'
'409': '199702953'
'410': '199703049'
'411': '199703126'
'412': '199703129'
'413': '199703432'
'414': '199703566'
'415': '199703570'
'416': '199704687'
'417': '199800085'
'418': '199800086'
'419': '199800142'
'420': '199800471'
'421': '199800503'
'422': '199800519'
'423': '199800751'
'424': '199800785'
'425': '199800903'
'426': '199800976'
'427': '199800979'
'428': '199800990'
'429': '199801011'
'430': '199801012'
'431': '199801099'
'432': '199801158'
'433': '199801165'
'434': '199801260'
'435': '199801263'
'436': '199801266'
'437': '199801267'
'438': '199801290'
'439': '199801292'
'440': '199801298'
'441': '199801524'
'442': '199801625'
'443': '199801634'
'444': '199801640'
'445': '199801649'
'446': '199802092'
'447': '199802093'
'448': '199802138'
'449': '199802175'
'450': '199802288'
'451': '199802289'
'452': '199802611'
'453': '199802620'
'454': '199802675'
'455': '199802769'
'456': '199803014'
'457': '199803062'
'458': '199803425'
'459': '199803684'
'460': '199803957'
'461': '199803958'
'462': '199806739'
'463': '199806820'
'464': '199806914'
'465': '199806978'
'466': '199900114'
'467': '199900120'
'468': '199900163'
'469': '199900203'
'470': '199900204'
'471': '199900351'
'472': '199900554'
'473': '199900672'
'474': '199900832'
'475': '199900834'
'476': '199900915'
'477': '199901076'
'478': '199901116'
'479': '199901211'
'480': '199901215'
'481': '199901218'
'482': '199901248'
'483': '199901258'
'484': '199901558'
'485': '199901692'
'486': '199901718'
'487': '199901759'
'488': '199902270'
'489': '199902295'
'490': '199902440'
'491': '199902448'
'492': '199902449'
'493': '199902459'
'494': '199902661'
'495': '199903179'
'496': '199903219'
'497': '199903222'
'498': '199903365'
'499': '199903836'
'500': '199903837'
'501': '199906840'
'502': '199907419'
'503': '200000209'
'504': '200000210'
'505': '200000220'
'506': '200000223'
'507': '200000224'
'508': '200000417'
'509': '200000559'
'510': '200000569'
'511': '200000601'
'512': '200000740'
'513': '200000919'
'514': '200000930'
'515': '200000932'
'516': '200000933'
'517': '200000934'
'518': '200000935'
'519': '200001172'
'520': '200001184'
'521': '200001186'
'522': '200001190'
'523': '200001275'
'524': '200001430'
'525': '200001578'
'526': '200001604'
'527': '200001858'
'528': '200002121'
'529': '200002422'
'530': '200002738'
'531': '200002760'
'532': '200002776'
'533': '200002787'
'534': '200002882'
'535': '200002887'
'536': '200002949'
'537': '200003080'
'538': '200003085'
'539': '200003086'
'540': '200003109'
'541': '200003111'
'542': '200003403'
'543': '200003442'
'544': '200003541'
'545': '200003560'
'546': '200003792'
'547': '200003994'
'548': '200003999'
'549': '200004038'
'550': '200004243'
'551': '200004640'
'552': '200004643'
'553': '200005087'
'554': '200005100'
'555': '200008568'
'556': '200008571'
'557': '200008952'
'558': '200009059'
'559': '200009061'
'560': '200009062'
'561': '200009063'
'562': '200009333'
'563': '200009496'
'564': '200009524'
'565': '200009634'
'566': '200009708'
'567': '200100102'
'568': '200100264'
'569': '200100383'
'570': '200100424'
'571': '200100429'
'572': '200100602'
'573': '200100614'
'574': '200100671'
'575': '200100725'
'576': '200100827'
'577': '200100858'
'578': '200100960'
'579': '200100997'
'580': '200101016'
'581': '200101017'
'582': '200101018'
'583': '200101022'
'584': '200101103'
'585': '200101123'
'586': '200101150'
'587': '200101162'
'588': '200101179'
'589': '200101249'
'590': '200101393'
'591': '200101407'
'592': '200101428'
'593': '200101451'
'594': '200101472'
'595': '200101633'
'596': '200101635'
'597': '200102266'
'598': '200102615'
'599': '200102622'
'600': '200102623'
'601': '200102644'
'602': '200102802'
'603': '200102845'
'604': '200102846'
'605': '200102887'
'606': '200102909'
'607': '200102920'
'608': '200102931'
'609': '200102998'
'610': '200103187'
'611': '200103882'
'612': '200103884'
'613': '200108384'
'614': '200108800'
'615': '200108817'
'616': '200108822'
'617': '200109139'
'618': '200109233'
'619': '200109769'
'620': '200110053'
'621': '200110059'
'622': '200200143'
'623': '200200152'
'624': '200200158'
'625': '200200169'
'626': '200200173'
'627': '200200201'
'628': '200200234'
'629': '200200252'
'630': '200200375'
'631': '200200377'
'632': '200200486'
'633': '200200502'
'634': '200200611'
'635': '200200663'
'636': '200200715'
'637': '200200719'
'638': '200200720'
'639': '200200723'
'640': '200200852'
'641': '200200939'
'642': '200201028'
'643': '200201029'
'644': '200201084'
'645': '200201087'
'646': '200201201'
'647': '200201329'
'648': '200201335'
'649': '200201339'
'650': '200201406'
'651': '200201524'
'652': '200201895'
'653': '200201906'
'654': '200201919'
'655': '200201926'
'656': '200201927'
'657': '200202092'
'658': '200202093'
'659': '200202362'
'660': '200202373'
'661': '200202476'
'662': '200202573'
'663': '200202617'
'664': '200202753'
'665': '200202755'
'666': '200202781'
'667': '200202782'
'668': '200202982'
'669': '200202994'
'670': '200203031'
'671': '200203496'
'672': '200203727'
'673': '200204308'
'674': '200204322'
'675': '200204348'
'676': '200204369'
'677': '200209794'
'678': '200210513'
'679': '200300089'
'680': '200300092'
'681': '200300095'
'682': '200300100'
'683': '200300102'
'684': '200300103'
'685': '200300104'
'686': '200300144'
'687': '200300154'
'688': '200300155'
'689': '200300192'
'690': '200300198'
'691': '200300260'
'692': '200300269'
'693': '200300283'
'694': '200300284'
'695': '200300286'
'696': '200300312'
'697': '200300313'
'698': '200300406'
'699': '200300408'
'700': '200300409'
'701': '200300423'
'702': '200300424'
'703': '200300431'
'704': '200300434'
'705': '200300458'
'706': '200300459'
'707': '200300461'
'708': '200300462'
'709': '200300463'
'710': '200300470'
'711': '200300520'
'712': '200300521'
'713': '200300523'
'714': '200300598'
'715': '200300603'
'716': '200300626'
'717': '200300638'
'718': '200300642'
'719': '200300645'
'720': '200300657'
'721': '200300660'
'722': '200300661'
'723': '200300694'
'724': '200300708'
'725': '200300784'
'726': '200300792'
'727': '200300796'
'728': '200300800'
'729': '200300801'
'730': '200300802'
'731': '200300804'
'732': '200300805'
'733': '200300806'
'734': '200300840'
'735': '200300843'
'736': '200300854'
'737': '200300855'
'738': '200300864'
'739': '200300878'
'740': '200300905'
'741': '200300913'
'742': '200300916'
'743': '200300917'
'744': '200300918'
'745': '200300984'
'746': '200300985'
'747': '200300987'
'748': '200301018'
'749': '200301019'
'750': '200301021'
'751': '200301031'
'752': '200301032'
'753': '200301033'
'754': '200301128'
'755': '200301131'
'756': '200301172'
'757': '200301181'
'758': '200301252'
'759': '200301323'
'760': '200301345'
'761': '200301348'
'762': '200301486'
'763': '200301507'
'764': '200301609'
'765': '200301610'
'766': '200301612'
'767': '200301618'
'768': '200301620'
'769': '200301624'
'770': '200301625'
'771': '200301667'
'772': '200301683'
'773': '200301739'
'774': '200301759'
'775': '200301795'
'776': '200301831'
'777': '200301937'
'778': '200301953'
'779': '200301956'
'780': '200301990'
'781': '200301991'
'782': '200302009'
'783': '200302015'
'784': '200302039'
'785': '200302049'
'786': '200302050'
'787': '200302051'
'788': '200302052'
'789': '200302053'
'790': '200302054'
'791': '200302073'
'792': '200302121'
'793': '200302122'
'794': '200302141'
'795': '200302187'
'796': '200302205'
'797': '200302214'
'798': '200302215'
'799': '200302254'
'800': '200302452'
'801': '200302456'
'802': '200302457'
'803': '200302476'
'804': '200302482'
'805': '200302484'
'806': '200302485'
'807': '200302486'
'808': '200302623'
'809': '200302627'
'810': '200302642'
'811': '200303280'
'812': '200303281'
'813': '200303328'
'814': '200307647'
'815': '200307781'
'816': '200307798'
'817': '200307799'
'818': '200307847'
'819': '200307848'
'820': '200307866'
'821': '200307867'
'822': '200308353'
'823': '200308354'
'824': '200308550'
'825': '200308552'
'826': '200308578'
'827': '200308603'
'828': '200308647'
'829': '200308774'
'830': '200308825'
'831': '200308841'
'832': '200308855'
'833': '200308867'
'834': '200400062'
'835': '200400100'
'836': '200400102'
'837': '200400104'
'838': '200400105'
'839': '200400111'
'840': '200400112'
'841': '200400113'
'842': '200400114'
'843': '200400118'
'844': '200400120'
'845': '200400121'
'846': '200400122'
'847': '200400123'
'848': '200400165'
'849': '200400166'
'850': '200400173'
'851': '200400175'
'852': '200400177'
'853': '200400179'
'854': '200400215'
'855': '200400323'
'856': '200400325'
'857': '200400387'
'858': '200400388'
'859': '200400392'
'860': '200400395'
'861': '200400397'
'862': '200400463'
'863': '200400465'
'864': '200400478'
'865': '200400482'
'866': '200400514'
'867': '200400519'
'868': '200400520'
'869': '200400528'
'870': '200400530'
'871': '200400538'
'872': '200400550'
'873': '200400561'
'874': '200400565'
'875': '200400570'
'876': '200400580'
'877': '200400581'
'878': '200400587'
'879': '200400644'
'880': '200400685'
'881': '200400686'
'882': '200400687'
'883': '200400691'
'884': '200400695'
'885': '200400700'
'886': '200400718'
'887': '200400724'
'888': '200400727'
'889': '200400728'
'890': '200400729'
'891': '200400798'
'892': '200400799'
'893': '200400801'
'894': '200400802'
'895': '200400803'
'896': '200400804'
'897': '200400813'
'898': '200400816'
'899': '200400817'
'900': '200400825'
'901': '200400828'
'902': '200400830'
'903': '200400841'
'904': '200400872'
'905': '200400894'
'906': '200400904'
'907': '200400910'
'908': '200400922'
'909': '200400930'
'910': '200400993'
'911': '200401005'
'912': '200401009'
'913': '200401062'
'914': '200401108'
'915': '200401172'
'916': '200401202'
'917': '200401210'
'918': '200401211'
'919': '200401219'
'920': '200401221'
'921': '200401222'
'922': '200401235'
'923': '200401242'
'924': '200401244'
'925': '200401247'
'926': '200401254'
'927': '200401323'
'928': '200401324'
'929': '200401325'
'930': '200401326'
'931': '200401327'
'932': '200401328'
'933': '200401329'
'934': '200401330'
'935': '200401332'
'936': '200401353'
'937': '200401359'
'938': '200401367'
'939': '200401369'
'940': '200401370'
'941': '200401371'
'942': '200401373'
'943': '200401383'
'944': '200401385'
'945': '200401397'
'946': '200401399'
'947': '200401401'
'948': '200401405'
'949': '200401407'
'950': '200401409'
'951': '200401411'
'952': '200401432'
'953': '200401547'
'954': '200401548'
'955': '200401578'
'956': '200401582'
'957': '200401586'
'958': '200401589'
'959': '200401651'
'960': '200401652'
'961': '200401666'
'962': '200401683'
'963': '200401684'
'964': '200401686'
'965': '200401687'
'966': '200401693'
'967': '200401718'
'968': '200401747'
'969': '200401750'
'970': '200401751'
'971': '200401755'
'972': '200401756'
'973': '200401762'
'974': '200401769'
'975': '200401848'
'976': '200401849'
'977': '200401850'
'978': '200401852'
'979': '200401857'
'980': '200401862'
'981': '200401874'
'982': '200401875'
'983': '200401876'
'984': '200401878'
'985': '200401881'
'986': '200401882'
'987': '200402344'
'988': '200402368'
'989': '200402374'
'990': '200402391'
'991': '200402393'
'992': '200402396'
'993': '200402399'
'994': '200402407'
'995': '200402408'
'996': '200402420'
'997': '200402435'
'998': '200402439'
'999': '200402455'
'1000': '200402458'
'1001': '200402464'
'1002': '200402465'
'1003': '200402466'
'1004': '200402468'
'1005': '200402485'
'1006': '200402487'
'1007': '200402490'
'1008': '200402508'
'1009': '200402510'
'1010': '200402522'
'1011': '200402557'
'1012': '200402604'
'1013': '200402617'
'1014': '200402623'
'1015': '200402674'
'1016': '200402677'
'1017': '200402681'
'1018': '200402690'
'1019': '200402816'
'1020': '200402819'
'1021': '200402823'
'1022': '200402824'
'1023': '200402825'
'1024': '200402830'
'1025': '200402834'
'1026': '200402835'
'1027': '200402839'
'1028': '200402840'
'1029': '200402841'
'1030': '200402842'
'1031': '200402845'
'1032': '200402847'
'1033': '200402848'
'1034': '200402852'
'1035': '200402853'
'1036': '200402879'
'1037': '200402902'
'1038': '200402903'
'1039': '200402912'
'1040': '200402913'
'1041': '200402915'
'1042': '200402927'
'1043': '200402928'
'1044': '200402929'
'1045': '200402931'
'1046': '200402939'
'1047': '200402943'
'1048': '200402944'
'1049': '200402945'
'1050': '200402946'
'1051': '200402947'
'1052': '200402949'
'1053': '200402952'
'1054': '200402968'
'1055': '200402973'
'1056': '200402979'
'1057': '200402981'
'1058': '200402982'
'1059': '200402983'
'1060': '200402992'
'1061': '200403040'
'1062': '200403041'
'1063': '200403042'
'1064': '200403049'
'1065': '200403052'
'1066': '200403053'
'1067': '200403054'
'1068': '200403056'
'1069': '200403060'
'1070': '200403062'
'1071': '200403075'
'1072': '200403081'
'1073': '200403082'
'1074': '200403084'
'1075': '200403085'
'1076': '200403086'
'1077': '200403089'
'1078': '200403090'
'1079': '200403104'
'1080': '200403111'
'1081': '200403121'
'1082': '200403125'
'1083': '200403126'
'1084': '200403137'
'1085': '200403138'
'1086': '200403160'
'1087': '200403167'
'1088': '200403183'
'1089': '200403184'
'1090': '200403185'
'1091': '200403186'
'1092': '200403226'
'1093': '200403289'
'1094': '200403290'
'1095': '200403291'
'1096': '200403292'
'1097': '200403296'
'1098': '200403304'
'1099': '200403305'
'1100': '200403309'
'1101': '200403312'
'1102': '200403317'
'1103': '200403319'
'1104': '200403321'
'1105': '200403323'
'1106': '200403325'
'1107': '200403327'
'1108': '200403329'
'1109': '200403333'
'1110': '200403335'
'1111': '200403336'
'1112': '200403405'
'1113': '200403411'
'1114': '200403413'
'1115': '200403414'
'1116': '200403415'
'1117': '200403419'
'1118': '200403425'
'1119': '200403426'
'1120': '200403427'
'1121': '200403433'
'1122': '200403434'
'1123': '200403448'
'1124': '200403459'
'1125': '200403460'
'1126': '200403461'
'1127': '200403462'
'1128': '200403463'
'1129': '200403464'
'1130': '200403465'
'1131': '200403783'
'1132': '200403794'
'1133': '200403814'
'1134': '200403816'
'1135': '200403819'
'1136': '200403822'
'1137': '200403982'
'1138': '200404000'
'1139': '200404007'
'1140': '200404009'
'1141': '200404029'
'1142': '200404034'
'1143': '200404038'
'1144': '200404039'
'1145': '200404040'
'1146': '200404042'
'1147': '200404043'
'1148': '200404044'
'1149': '200404046'
'1150': '200404048'
'1151': '200404050'
'1152': '200404051'
'1153': '200404055'
'1154': '200404059'
'1155': '200404060'
'1156': '200404168'
'1157': '200404421'
'1158': '200404437'
'1159': '200404438'
'1160': '200404644'
'1161': '200404646'
'1162': '200404647'
'1163': '200404653'
'1164': '200404654'
'1165': '200404656'
'1166': '200404657'
'1167': '200404669'
'1168': '200404670'
'1169': '200404671'
'1170': '200404677'
'1171': '200404678'
'1172': '200404684'
'1173': '200404690'
'1174': '200404692'
'1175': '200404693'
'1176': '200404695'
'1177': '200404698'
'1178': '200404699'
'1179': '200404700'
'1180': '200404701'
'1181': '200404702'
'1182': '200404707'
'1183': '200404709'
'1184': '200404711'
'1185': '200404712'
'1186': '200404719'
'1187': '200404721'
'1188': '200404722'
'1189': '200404724'
'1190': '200404726'
'1191': '200404740'
'1192': '200404741'
'1193': '200404747'
'1194': '200404754'
'1195': '200404762'
'1196': '200404763'
'1197': '200404771'
'1198': '200404772'
'1199': '200404779'
'1200': '200409929'
'1201': '200409930'
'1202': '200409932'
'1203': '200409933'
'1204': '200410082'
'1205': '200410083'
'1206': '200410085'
'1207': '200410086'
'1208': '200410087'
'1209': '200410088'
'1210': '200410089'
'1211': '200410090'
'1212': '200410337'
'1213': '200410865'
'1214': '200410892'
'1215': '200410894'
'1216': '200410896'
'1217': '200410897'
'1218': '200410901'
'1219': '200410902'
'1220': '200410905'
'1221': '200410909'
'1222': '200410941'
'1223': '200410942'
'1224': '200410943'
'1225': '200410951'
'1226': '200410977'
'1227': '200410999'
'1228': '200411002'
'1229': '200411033'
'1230': '200411042'
'1231': '200411043'
'1232': '200411062'
'1233': '200411064'
'1234': '200411065'
'1235': '200411066'
'1236': '200411067'
'1237': '200411068'
'1238': '200411087'
'1239': '200411095'
'1240': '200411207'
'1241': '200411208'
'1242': '200500111'
'1243': '200500125'
'1244': '200500128'
'1245': '200500130'
'1246': '200500132'
'1247': '200500133'
'1248': '200500134'
'1249': '200500135'
'1250': '200500140'
'1251': '200500144'
'1252': '200500145'
'1253': '200500146'
'1254': '200500150'
'1255': '200500161'
'1256': '200500162'
'1257': '200500246'
'1258': '200500248'
'1259': '200500249'
'1260': '200500251'
'1261': '200500252'
'1262': '200500253'
'1263': '200500254'
'1264': '200500255'
'1265': '200500257'
'1266': '200500258'
'1267': '200500287'
'1268': '200500288'
'1269': '200500302'
'1270': '200500424'
'1271': '200500426'
'1272': '200500430'
'1273': '200500474'
'1274': '200500476'
'1275': '200500482'
'1276': '200500483'
'1277': '200500545'
'1278': '200500552'
'1279': '200500553'
'1280': '200500554'
'1281': '200500581'
'1282': '200500583'
'1283': '200500585'
'1284': '200500589'
'1285': '200500592'
'1286': '200500605'
'1287': '200500622'
'1288': '200500649'
'1289': '200500686'
'1290': '200500694'
'1291': '200500695'
'1292': '200500699'
'1293': '200500704'
'1294': '200500797'
'1295': '200500800'
'1296': '200500805'
'1297': '200500806'
'1298': '200500809'
'1299': '200500815'
'1300': '200500835'
'1301': '200500853'
'1302': '200500854'
'1303': '200500864'
'1304': '200500870'
'1305': '200500871'
'1306': '200500882'
'1307': '200500892'
'1308': '200500961'
'1309': '200500962'
'1310': '200500963'
'1311': '200500965'
'1312': '200500969'
'1313': '200501012'
'1314': '200501107'
'1315': '200501119'
'1316': '200501120'
'1317': '200501141'
'1318': '200501145'
'1319': '200501155'
'1320': '200501163'
'1321': '200501237'
'1322': '200501239'
'1323': '200501241'
'1324': '200501243'
'1325': '200501265'
'1326': '200501267'
'1327': '200501277'
'1328': '200501288'
'1329': '200501291'
'1330': '200501293'
'1331': '200501301'
'1332': '200501307'
'1333': '200501384'
'1334': '200501427'
'1335': '200501429'
'1336': '200501458'
'1337': '200501472'
'1338': '200501511'
'1339': '200501522'
'1340': '200501593'
'1341': '200501601'
'1342': '200501605'
'1343': '200501618'
'1344': '200501667'
'1345': '200501979'
'1346': '200501997'
'1347': '200501999'
'1348': '200502010'
'1349': '200502012'
'1350': '200502014'
'1351': '200502017'
'1352': '200502029'
'1353': '200502030'
'1354': '200502035'
'1355': '200502039'
'1356': '200502050'
'1357': '200502052'
'1358': '200502054'
'1359': '200502086'
'1360': '200502087'
'1361': '200502089'
'1362': '200502090'
'1363': '200502105'
'1364': '200502107'
'1365': '200502109'
'1366': '200502175'
'1367': '200502192'
'1368': '200502218'
'1369': '200502219'
'1370': '200502306'
'1371': '200502416'
'1372': '200502470'
'1373': '200502472'
'1374': '200502473'
'1375': '200502474'
'1376': '200502490'
'1377': '200502557'
'1378': '200502588'
'1379': '200502598'
'1380': '200502600'
'1381': '200502601'
'1382': '200502633'
'1383': '200502711'
'1384': '200502717'
'1385': '200502742'
'1386': '200502744'
'1387': '200502757'
'1388': '200502762'
'1389': '200502770'
'1390': '200502808'
'1391': '200502809'
'1392': '200502810'
'1393': '200502834'
'1394': '200502838'
'1395': '200502901'
'1396': '200502953'
'1397': '200503063'
'1398': '200503065'
'1399': '200503066'
'1400': '200503075'
'1401': '200503128'
'1402': '200503200'
'1403': '200503234'
'1404': '200503238'
'1405': '200503246'
'1406': '200503619'
'1407': '200503637'
'1408': '200503642'
'1409': '200503652'
'1410': '200503653'
'1411': '200503664'
'1412': '200503775'
'1413': '200503783'
'1414': '200503786'
'1415': '200503787'
'1416': '200503788'
'1417': '200503798'
'1418': '200503799'
'1419': '200503803'
'1420': '200503806'
'1421': '200503807'
'1422': '200505665'
'1423': '200511055'
'1424': '200511056'
'1425': '200511057'
'1426': '200511058'
'1427': '200511059'
'1428': '200511060'
'1429': '200511064'
'1430': '200511065'
'1431': '200511066'
'1432': '200511067'
'1433': '200511068'
'1434': '200511106'
'1435': '200511107'
'1436': '200511108'
'1437': '200511262'
'1438': '200511263'
'1439': '200511264'
'1440': '200511266'
'1441': '200511904'
'1442': '200512186'
'1443': '200512201'
'1444': '200512238'
'1445': '200512240'
'1446': '200512244'
'1447': '200600026'
'1448': '200600027'
'1449': '200600028'
'1450': '200600036'
'1451': '200600049'
'1452': '200600054'
'1453': '200600081'
'1454': '200600114'
'1455': '200600116'
'1456': '200600127'
'1457': '200600128'
'1458': '200600134'
'1459': '200600138'
'1460': '200600156'
'1461': '200600159'
'1462': '200600160'
'1463': '200600178'
'1464': '200600181'
'1465': '200600221'
'1466': '200600222'
'1467': '200600225'
'1468': '200600248'
'1469': '200600249'
'1470': '200600267'
'1471': '200600287'
'1472': '200600325'
'1473': '200600352'
'1474': '200600367'
'1475': '200600456'
'1476': '200600460'
'1477': '200600522'
'1478': '200600527'
'1479': '200600543'
'1480': '200600562'
'1481': '200600665'
'1482': '200600674'
'1483': '200600688'
'1484': '200600696'
'1485': '200600697'
'1486': '200600698'
'1487': '200600811'
'1488': '200600818'
'1489': '200600819'
'1490': '200600821'
'1491': '200603815'
'1492': '200603871'
'1493': '200603872'
'1494': '200604151'
'1495': '200604193'
'1496': '200604212'
'1497': '200604227'
'1498': '200604239'
'1499': '200604243'
'1500': '200604258'
'1501': '200604293'
'1502': '200604308'
'1503': '200604312'
'1504': '200604332'
'1505': '200604334'
'1506': '200604358'
'1507': '200604408'
'1508': '200605253'
'1509': '200605344'
'1510': '200605364'
'1511': '200605381'
'1512': '200605398'
'1513': '200605403'
'1514': '200605410'
'1515': '200605427'
'1516': '200605439'
'1517': '200605445'
'1518': '200605447'
'1519': '200605448'
'1520': '200605450'
'1521': '200605468'
'1522': '200605481'
'1523': '200605499'
'1524': '200605537'
'1525': '200606180'
'1526': '200606181'
'1527': '200606182'
'1528': '200606183'
'1529': '200606294'
'1530': '200606297'
'1531': '200606303'
'1532': '200606308'
'1533': '200606318'
'1534': '200606326'
'1535': '200606351'
'1536': '200606537'
'1537': '200606825'
'1538': '200606830'
'1539': '200606835'
'1540': '200606965'
'1541': '200606986'
'1542': '200607004'
'1543': '200607445'
'1544': '200607486'
'1545': '200607677'
'1546': '200607709'
'1547': '200607743'
'1548': '200607747'
'1549': '200607750'
'1550': '200607776'
'1551': '200607777'
'1552': '200607778'
'1553': '200607784'
'1554': '200607785'
'1555': '200607847'
'1556': '200607874'
'1557': '200607876'
'1558': '200607890'
'1559': '200607907'
'1560': '200607908'
'1561': '200608189'
'1562': '200608192'
'1563': '200608219'
'1564': '200608232'
'1565': '200608240'
'1566': '200608275'
'1567': '200608278'
'1568': '200608280'
'1569': '200608291'
'1570': '200608295'
'1571': '200608296'
'1572': '200610660'
'1573': '200610661'
'1574': '200610785'
'1575': '200610879'
'1576': '200610960'
'1577': '200610978'
'1578': '200610980'
'1579': '200610986'
'1580': '200611006'
'1581': '200611007'
'1582': '200611010'
'1583': '200611019'
'1584': '200611031'
'1585': '200611039'
'1586': '200611041'
'1587': '200611042'
'1588': '200611053'
'1589': '200611114'
'1590': '200611303'
'1591': '200611304'
'1592': '200611332'
'1593': '200611338'
'1594': '200611385'
'1595': '200611392'
'1596': '200612115'
'1597': '200612573'
'1598': '200612728'
'1599': '200612743'
'1600': '200700055'
'1601': '200700234'
'1602': '200700435'
'1603': '200700437'
'1604': '200700455'
'1605': '200700540'
'1606': '200700562'
'1607': '200700790'
'1608': '200700814'
'1609': '200700835'
'1610': '200700836'
'1611': '200700844'
'1612': '200700876'
'1613': '200700890'
'1614': '200701020'
'1615': '200701024'
'1616': '200701042'
'1617': '200701201'
'1618': '200701231'
'1619': '200701233'
'1620': '200701235'
'1621': '200701236'
'1622': '200701246'
'1623': '200701251'
'1624': '200701335'
'1625': '200701403'
'1626': '200701405'
'1627': '200701406'
'1628': '200701412'
'1629': '200701642'
'1630': '200701809'
'1631': '200701876'
'1632': '200701917'
'1633': '200701932'
'1634': '200701934'
'1635': '200702089'
'1636': '200702091'
'1637': '200702101'
'1638': '200702108'
'1639': '200702135'
'1640': '200702153'
'1641': '200702164'
'1642': '200702166'
'1643': '200702172'
'1644': '200702173'
'1645': '200702191'
'1646': '200702195'
'1647': '200702200'
'1648': '200702250'
'1649': '200702255'
'1650': '200702260'
'1651': '200702309'
'1652': '200702313'
'1653': '200702358'
'1654': '200702389'
'1655': '200702420'
'1656': '200702508'
'1657': '200702521'
'1658': '200702532'
'1659': '200702536'
'1660': '200702538'
'1661': '200702549'
'1662': '200702572'
'1663': '200702577'
'1664': '200702584'
'1665': '200702691'
'1666': '200702709'
'1667': '200702739'
'1668': '200702749'
'1669': '200702963'
'1670': '200702966'
'1671': '200702967'
'1672': '200702969'
'1673': '200702970'
'1674': '200703052'
'1675': '200703075'
'1676': '200703120'
'1677': '200703121'
'1678': '200703129'
'1679': '200703132'
'1680': '200703142'
'1681': '200703160'
'1682': '200703161'
'1683': '200703163'
'1684': '200703169'
'1685': '200703171'
'1686': '200703172'
'1687': '200703173'
'1688': '200703185'
'1689': '200703201'
'1690': '200703229'
'1691': '200703233'
'1692': '200703236'
'1693': '200703261'
'1694': '200703278'
'1695': '200703394'
'1696': '200703429'
'1697': '200703452'
'1698': '200703458'
'1699': '200703459'
'1700': '200703462'
'1701': '200703463'
'1702': '200703464'
'1703': '200703663'
'1704': '200703675'
'1705': '200703700'
'1706': '200703701'
'1707': '200703715'
'1708': '200703733'
'1709': '200703737'
'1710': '200703797'
'1711': '200703801'
'1712': '200703835'
'1713': '200703842'
'1714': '200703852'
'1715': '200703873'
'1716': '200703876'
'1717': '200703877'
'1718': '200703964'
'1719': '200704019'
'1720': '200704285'
'1721': '200704316'
'1722': '200704320'
'1723': '200704322'
'1724': '200704420'
'1725': '200704501'
'1726': '200704534'
'1727': '200704541'
'1728': '200704587'
'1729': '200704612'
'1730': '200704614'
'1731': '200704765'
'1732': '200704813'
'1733': '200704961'
'1734': '200704969'
'1735': '200705001'
'1736': '200705314'
'1737': '200705392'
'1738': '200705402'
'1739': '200705500'
'1740': '200705587'
'1741': '200705588'
'1742': '200705595'
'1743': '200705596'
'1744': '200705817'
'1745': '200706007'
'1746': '200706008'
'1747': '200706029'
'1748': '200706030'
'1749': '200706059'
'1750': '200706185'
'1751': '200706243'
'1752': '200706330'
'1753': '200706331'
'1754': '200706332'
'1755': '200706333'
'1756': '200706341'
'1757': '200706400'
'1758': '200706467'
'1759': '200706537'
'1760': '200706542'
'1761': '200706569'
'1762': '200706597'
'1763': '200706766'
'1764': '200706788'
'1765': '200706796'
'1766': '200706908'
'1767': '200706941'
'1768': '200706984'
'1769': '200707142'
'1770': '200707175'
'1771': '200707251'
'1772': '200707373'
'1773': '200707382'
'1774': '200707383'
'1775': '200707385'
'1776': '200707388'
'1777': '200707390'
'1778': '200707521'
'1779': '200707723'
'1780': '200707792'
'1781': '200707920'
'1782': '200707944'
'1783': '200707947'
'1784': '200707958'
'1785': '200708055'
'1786': '200708144'
'1787': '200708364'
'1788': '200708367'
'1789': '200708368'
'1790': '200708372'
'1791': '200708376'
'1792': '200708427'
'1793': '200708447'
'1794': '200708455'
'1795': '200708456'
'1796': '200708458'
'1797': '200708534'
'1798': '200708585'
'1799': '200708589'
'1800': '200708592'
'1801': '200708595'
'1802': '200708596'
'1803': '200708597'
'1804': '200708599'
'1805': '200708600'
'1806': '200708601'
'1807': '200708754'
'1808': '200708786'
'1809': '200708816'
'1810': '200708819'
'1811': '200708961'
'1812': '200709004'
'1813': '200709010'
'1814': '200709163'
'1815': '200709201'
'1816': '200709202'
'1817': '200709249'
'1818': '200709331'
'1819': '200709389'
'1820': '200709390'
'1821': '200709448'
'1822': '200709527'
'1823': '200709544'
'1824': '200709638'
'1825': '200709640'
'1826': '200709641'
'1827': '200709794'
'1828': '200709818'
'1829': '200709901'
'1830': '200709902'
'1831': '200709913'
'1832': '200709915'
'1833': '200710087'
'1834': '200710374'
'1835': '200710524'
'1836': '200710628'
'1837': '200710758'
'1838': '200710759'
'1839': '200710760'
'1840': '200710808'
'1841': '200710855'
'1842': '200710856'
'1843': '200710957'
'1844': '200710998'
'1845': '200711148'
'1846': '200711216'
'1847': '200711286'
'1848': '200711309'
'1849': '200711422'
'1850': '200711426'
'1851': '200711456'
'1852': '200711527'
'1853': '200711589'
'1854': '200711634'
'1855': '200711885'
'1856': '200711886'
'1857': '200711927'
'1858': '200711934'
'1859': '200711953'
'1860': '200712070'
'1861': '200712241'
'1862': '200712325'
'1863': '200712401'
'1864': '200712587'
'1865': '200712588'
'1866': '200712647'
'1867': '200712648'
'1868': '200712742'
'1869': '200712789'
'1870': '200712868'
'1871': '200712885'
'1872': '200713004'
'1873': '200713008'
'1874': '200713009'
'1875': '200713032'
'1876': '200713079'
'1877': '200713170'
'1878': '200713296'
'1879': '200713517'
'1880': '200713591'
'1881': '200713685'
'1882': '200713688'
'1883': '200713885'
'1884': '200713886'
'1885': '200713906'
'1886': '200713912'
'1887': '200800284'
'1888': '200800291'
'1889': '200800316'
'1890': '200800320'
'1891': '200800330'
'1892': '200800331'
'1893': '200800339'
'1894': '200800347'
'1895': '200800405'
'1896': '200800411'
'1897': '200800569'
'1898': '200800614'
'1899': '200801141'
'1900': '200801284'
'1901': '200801332'
'1902': '200801333'
'1903': '200801350'
'1904': '200801352'
'1905': '200801383'
'1906': '200801460'
'1907': '200801517'
'1908': '200801564'
'1909': '200801686'
'1910': '200801763'
'1911': '200801812'
'1912': '200801814'
'1913': '200801822'
'1914': '200801836'
'1915': '200802022'
'1916': '200802064'
'1917': '200802066'
'1918': '200802182'
'1919': '200802325'
'1920': '200802327'
'1921': '200802392'
'1922': '200802547'
'1923': '200802557'
'1924': '200802558'
'1925': '200802788'
'1926': '200802853'
'1927': '200802965'
'1928': '200803000'
'1929': '200803148'
'1930': '200803200'
'1931': '200803246'
'1932': '200803290'
'1933': '200803292'
'1934': '200803294'
'1935': '200803417'
'1936': '200803428'
'1937': '200803434'
'1938': '200803641'
'1939': '200803693'
'1940': '200803701'
'1941': '200803737'
'1942': '200803803'
'1943': '200803972'
'1944': '200804092'
'1945': '200804225'
'1946': '200804412'
'1947': '200804519'
'1948': '200804556'
'1949': '200804557'
'1950': '200804560'
'1951': '200804563'
'1952': '200804586'
'1953': '200804611'
'1954': '200804633'
'1955': '200804796'
'1956': '200804798'
'1957': '200804802'
'1958': '200804810'
'1959': '200804912'
'1960': '200804947'
'1961': '200804956'
'1962': '200805037'
'1963': '200805064'
'1964': '200805085'
'1965': '200805090'
'1966': '200805113'
'1967': '200805124'
'1968': '200805152'
'1969': '200805214'
'1970': '200805215'
'1971': '200805262'
'1972': '200805275'
'1973': '200805290'
'1974': '200805301'
'1975': '200805341'
'1976': '200805353'
'1977': '200805387'
'1978': '200805408'
'1979': '200805418'
'1980': '200805421'
'1981': '200805455'
'1982': '200805495'
'1983': '200805550'
'1984': '200805595'
'1985': '200805598'
'1986': '200805599'
'1987': '200805632'
'1988': '200805676'
'1989': '200805680'
'1990': '200805734'
'1991': '200805735'
'1992': '200805737'
'1993': '200805739'
'1994': '200805740'
'1995': '200805748'
'1996': '200805749'
'1997': '200805752'
'1998': '200805810'
'1999': '200805812'
'2000': '200805818'
'2001': '200805819'
'2002': '200805821'
'2003': '200805842'
'2004': '200805849'
'2005': '200805895'
'2006': '200805901'
'2007': '200805915'
'2008': '200805918'
'2009': '200805919'
'2010': '200805944'
'2011': '200805986'
'2012': '200805990'
'2013': '200805991'
'2014': '200806022'
'2015': '200806024'
'2016': '200806190'
'2017': '200806191'
'2018': '200806192'
'2019': '200806194'
'2020': '200806299'
'2021': '200806361'
'2022': '200806365'
'2023': '200806368'
'2024': '200806373'
'2025': '200806455'
'2026': '200806634'
'2027': '200806668'
'2028': '200806736'
'2029': '200806769'
'2030': '200806842'
'2031': '200806896'
'2032': '200806977'
'2033': '200807044'
'2034': '200807051'
'2035': '200807066'
'2036': '200807067'
'2037': '200807068'
'2038': '200807203'
'2039': '200807204'
'2040': '200807205'
'2041': '200807208'
'2042': '200807210'
'2043': '200807211'
'2044': '200807223'
'2045': '200807229'
'2046': '200807230'
'2047': '200807253'
'2048': '200807310'
'2049': '200807361'
'2050': '200807363'
'2051': '200807365'
'2052': '200807367'
'2053': '200807374'
'2054': '200807377'
'2055': '200807391'
'2056': '200807418'
'2057': '200807430'
'2058': '200807431'
'2059': '200807469'
'2060': '200807526'
'2061': '200807527'
'2062': '200807528'
'2063': '200807539'
'2064': '200807552'
'2065': '200807582'
'2066': '200807617'
'2067': '200807618'
'2068': '200807633'
'2069': '200807683'
'2070': '200807684'
'2071': '200807685'
'2072': '200807813'
'2073': '200807814'
'2074': '200807945'
'2075': '200807949'
'2076': '200807985'
'2077': '200807986'
'2078': '200807989'
'2079': '200808076'
'2080': '200808078'
'2081': '200808091'
'2082': '200808126'
'2083': '200808127'
'2084': '200808195'
'2085': '200808219'
'2086': '200808244'
'2087': '200808245'
'2088': '200808272'
'2089': '200808273'
'2090': '200808309'
'2091': '200808362'
'2092': '200808377'
'2093': '200808511'
'2094': '200808562'
'2095': '200808715'
'2096': '200808774'
'2097': '200808790'
'2098': '200808828'
'2099': '200808862'
'2100': '200808985'
'2101': '200809045'
'2102': '200809058'
'2103': '200809076'
'2104': '200809079'
'2105': '200809084'
'2106': '200809185'
'2107': '200809207'
'2108': '200809208'
'2109': '200809240'
'2110': '200809258'
'2111': '200809388'
'2112': '200809402'
'2113': '200809403'
'2114': '200809464'
'2115': '200809649'
'2116': '200809791'
'2117': '200809813'
'2118': '200809818'
'2119': '200809819'
'2120': '200809825'
'2121': '200809902'
'2122': '200809909'
'2123': '200809910'
'2124': '200810003'
'2125': '200810004'
'2126': '200810005'
'2127': '200810006'
'2128': '200810031'
'2129': '200810113'
'2130': '200810114'
'2131': '200810125'
'2132': '200810237'
'2133': '200810314'
'2134': '200810320'
'2135': '200810340'
'2136': '200810424'
'2137': '200810425'
'2138': '200810427'
'2139': '200810440'
'2140': '200810513'
'2141': '200810517'
'2142': '200810584'
'2143': '200810700'
'2144': '200810702'
'2145': '200810703'
'2146': '200810727'
'2147': '200810760'
'2148': '200810870'
'2149': '200810940'
'2150': '200810973'
'2151': '200810981'
'2152': '200811250'
'2153': '200811251'
'2154': '200811253'
'2155': '200811262'
'2156': '200811364'
'2157': '200811406'
'2158': '200811564'
'2159': '200811608'
'2160': '200811623'
'2161': '200811629'
'2162': '200811701'
'2163': '200811736'
'2164': '200811814'
'2165': '200811885'
'2166': '200900194'
'2167': '200900196'
'2168': '200900341'
'2169': '200900606'
'2170': '200900668'
'2171': '200900671'
'2172': '200900674'
'2173': '200900767'
'2174': '200900878'
'2175': '200900889'
'2176': '200900892'
'2177': '200900895'
'2178': '200900897'
'2179': '200900899'
'2180': '200900967'
'2181': '200900997'
'2182': '200901007'
'2183': '200901073'
'2184': '200901207'
'2185': '200901232'
'2186': '200901456'
'2187': '200901507'
'2188': '200901536'
'2189': '200901544'
'2190': '200901619'
'2191': '200901817'
'2192': '200902042'
'2193': '200902080'
'2194': '200902207'
'2195': '200902213'
'2196': '200902281'
'2197': '200902301'
'2198': '200902443'
'2199': '200902507'
'2200': '200902537'
'2201': '200903041'
'2202': '200903043'
'2203': '200903115'
'2204': '200903125'
'2205': '200903409'
'2206': '200903782'
'2207': '200903801'
'2208': '200903974'
'2209': '200904162'
'2210': '200904163'
'2211': '200904234'
'2212': '200904308'
'2213': '200904325'
'2214': '200904514'
'2215': '200904538'
'2216': '200904548'
'2217': '200904649'
'2218': '200904678'
'2219': '200904689'
'2220': '200904855'
'2221': '200904880'
'2222': '200904881'
'2223': '200904990'
'2224': '200905004'
'2225': '200905017'
'2226': '200905053'
'2227': '200905055'
'2228': '200905151'
'2229': '200905163'
'2230': '200905196'
'2231': '200905218'
'2232': '200905315'
'2233': '200905403'
'2234': '200905653'
'2235': '200905876'
'2236': '200905938'
'2237': '200905939'
'2238': '200906073'
'2239': '200906330'
'2240': '200906384'
'2241': '200906557'
'2242': '200906562'
'2243': '200906578'
'2244': '200906599'
'2245': '200906602'
'2246': '200906966'
'2247': '200907093'
'2248': '200907385'
'2249': '200907446'
'2250': '200907725'
'2251': '200908116'
'2252': '200908185'
'2253': '200908186'
'2254': '200908333'
'2255': '200908369'
'2256': '200908478'
'2257': '200908479'
'2258': '200908781'
'2259': '200908817'
'2260': '200908819'
'2261': '200908825'
'2262': '200908953'
'2263': '200909377'
'2264': '201000300'
'2265': '201000406'
'2266': '201000637'
'2267': '201000674'
'2268': '201000826'
'2269': '201000869'
'2270': '201000874'
'2271': '201000875'
'2272': '201000877'
'2273': '201000878'
'2274': '201000951'
'2275': '201000963'
'2276': '201000967'
'2277': '201000972'
'2278': '201000974'
'2279': '201000975'
'2280': '201000977'
'2281': '201000980'
'2282': '201000981'
'2283': '201000982'
'2284': '201000985'
'2285': '201000989'
'2286': '201000993'
'2287': '201000994'
'2288': '201000998'
'2289': '201000999'
'2290': '201001001'
'2291': '201001003'
'2292': '201001004'
'2293': '201001005'
'2294': '201001006'
'2295': '201001007'
'2296': '201001008'
'2297': '201001013'
'2298': '201001017'
'2299': '201001018'
'2300': '201001021'
'2301': '201001022'
'2302': '201001023'
'2303': '201001024'
'2304': '201001057'
'2305': '201001424'
'2306': '201001547'
'2307': '201001640'
'2308': '201001656'
'2309': '201001682'
'2310': '201001683'
'2311': '201001685'
'2312': '201001699'
'2313': '201001707'
'2314': '201001723'
'2315': '201001731'
'2316': '201001744'
'2317': '201002151'
'2318': '201002277'
'2319': '201002395'
'2320': '201002396'
'2321': '201002402'
'2322': '201002449'
'2323': '201002450'
'2324': '201002503'
'2325': '201003043'
'2326': '201003049'
'2327': '201003203'
'2328': '201003206'
'2329': '201003207'
'2330': '201003213'
'2331': '201003370'
'2332': '201003397'
'2333': '201003398'
'2334': '201003713'
'2335': '201004170'
'2336': '201004204'
'2337': '201004230'
'2338': '201004257'
'2339': '201004440'
'2340': '201004538'
'2341': '201004680'
'2342': '201004693'
'2343': '201004700'
'2344': '201004997'
'2345': '201005059'
'2346': '201005070'
'2347': '201005121'
'2348': '201005235'
'2349': '201005236'
'2350': '201005685'
'2351': '201005699'
'2352': '201005763'
'2353': '201005770'
'2354': '201005771'
'2355': '201005774'
'2356': '201005775'
'2357': '201005841'
'2358': '201006075'
'2359': '201006077'
'2360': '201006120'
'2361': '201006205'
'2362': '201006206'
'2363': '201006211'
'2364': '201006442'
'2365': '201006443'
'2366': '201006495'
'2367': '201006592'
'2368': '201006784'
'2369': '201006870'
'2370': '201006871'
'2371': '201007233'
'2372': '201007234'
'2373': '201007276'
'2374': '201007277'
'2375': '201007281'
'2376': '201007401'
'2377': '201007460'
'2378': '201007499'
'2379': '201007523'
'2380': '201007529'
'2381': '201007569'
'2382': '201007652'
'2383': '201007672'
'2384': '201007708'
'2385': '201007750'
'2386': '201007774'
'2387': '201007782'
'2388': '201007783'
'2389': '201100026'
'2390': '201100231'
'2391': '201100273'
'2392': '201100299'
'2393': '201100375'
'2394': '201100467'
'2395': '201100491'
'2396': '201100500'
'2397': '201100502'
'2398': '201100572'
'2399': '201100792'
'2400': '201100793'
'2401': '201100794'
'2402': '201100795'
'2403': '201100797'
'2404': '201100798'
'2405': '201100799'
'2406': '201100800'
'2407': '201100801'
'2408': '201100802'
'2409': '201100821'
'2410': '201100826'
'2411': '201100827'
'2412': '201100828'
'2413': '201100866'
'2414': '201100867'
'2415': '201100869'
'2416': '201100888'
'2417': '201100889'
'2418': '201100894'
'2419': '201100898'
'2420': '201100900'
'2421': '201100901'
'2422': '201100902'
'2423': '201100904'
'2424': '201100905'
'2425': '201100906'
'2426': '201100908'
'2427': '201100911'
'2428': '201100913'
'2429': '201100945'
'2430': '201100946'
'2431': '201101004'
'2432': '201101014'
'2433': '201101022'
'2434': '201101038'
'2435': '201101640'
'2436': '201101643'
'2437': '201101644'
'2438': '201101756'
'2439': '201101761'
'2440': '201101840'
'2441': '201101843'
'2442': '201101844'
'2443': '201101845'
'2444': '201101846'
'2445': '201101847'
'2446': '201101857'
'2447': '201101860'
'2448': '201101861'
'2449': '201101884'
'2450': '201101885'
'2451': '201102450'
'2452': '201102476'
'2453': '201102477'
'2454': '201102757'
'2455': '201102767'
'2456': '201102768'
'2457': '201102769'
'2458': '201102784'
'2459': '201103159'
'2460': '201103161'
'2461': '201103324'
'2462': '201103422'
'2463': '201103424'
'2464': '201103425'
'2465': '201103468'
'2466': '201103469'
'2467': '201103472'
'2468': '201103475'
'2469': '201103563'
'2470': '201103609'
'2471': '201103615'
'2472': '201103617'
'2473': '201103909'
'2474': '201103986'
'2475': '201104026'
'2476': '201104041'
'2477': '201104062'
'2478': '201104063'
'2479': '201104066'
'2480': '201104111'
'2481': '201104471'
'2482': '201104754'
'2483': '201104755'
'2484': '201104801'
'2485': '201105016'
'2486': '201105106'
'2487': '201105143'
'2488': '201105144'
'2489': '201105270'
'2490': '201105412'
'2491': '201105413'
'2492': '201105484'
'2493': '201105487'
'2494': '201105488'
'2495': '201105600'
'2496': '201105697'
'2497': '201105756'
'2498': '201105914'
'2499': '201105915'
'2500': '201105918'
'2501': '201106145'
'2502': '201106366'
'2503': '201106390'
'2504': '201106781'
'2505': '201106782'
'2506': '201106947'
'2507': '201106949'
'2508': '201106952'
'2509': '201106953'
'2510': '201107002'
'2511': '201107176'
'2512': '201107199'
'2513': '201107201'
'2514': '201107219'
'2515': '201107220'
'2516': '201107224'
'2517': '201107229'
'2518': '201107233'
'2519': '201107234'
'2520': '201107272'
'2521': '201107273'
'2522': '201107290'
'2523': '201107503'
'2524': '201107505'
'2525': '201107507'
'2526': '201107509'
'2527': '201107512'
'2528': '201107513'
'2529': '201107515'
'2530': '201107516'
'2531': '201107525'
'2532': '201107526'
'2533': '201107530'
'2534': '201107536'
'2535': '201107539'
'2536': '201107542'
'2537': '201107543'
'2538': '201108989'
'2539': '201108990'
'2540': '201109024'
'2541': '201109030'
'2542': '201109142'
'2543': '201109150'
'2544': '201109151'
'2545': '201109154'
'2546': '201109155'
'2547': '201109222'
'2548': '201109223'
'2549': '201109224'
'2550': '201109225'
'2551': '201109227'
'2552': '201109228'
'2553': '201109231'
'2554': '201109232'
'2555': '201109259'
'2556': '201109260'
'2557': '201109276'
'2558': '201109277'
'2559': '201109291'
'2560': '201109296'
'2561': '201109309'
'2562': '201109310'
'2563': '201109313'
'2564': '201109442'
'2565': '201109514'
'2566': '201109733'
'2567': '201109818'
'2568': '201110023'
'2569': '201110095'
'2570': '201110136'
'2571': '201110192'
'2572': '201110292'
'2573': '201110398'
'2574': '201110512'
'2575': '201110516'
'2576': '201110646'
'2577': '201111149'
'2578': '201112194'
'2579': '201112195'
'2580': '201200003'
'2581': '201200004'
'2582': '201200009'
'2583': '201200186'
'2584': '201200204'
'2585': '201200246'
'2586': '201200277'
'2587': '201200586'
'2588': '201200603'
'2589': '201200635'
'2590': '201200636'
'2591': '201200678'
'2592': '201200717'
'2593': '201200854'
'2594': '201200930'
'2595': '201200975'
'2596': '201201166'
'2597': '201201261'
'2598': '201201265'
'2599': '201201601'
'2600': '201201602'
'2601': '201201680'
'2602': '201201690'
'2603': '201201693'
'2604': '201201694'
'2605': '201201837'
'2606': '201202398'
'2607': '201202780'
'2608': '201202801'
'2609': '201202902'
'2610': '201203127'
'2611': '201203128'
'2612': '201203513'
'2613': '201203514'
'2614': '201203607'
'2615': '201203608'
'2616': '201203744'
'2617': '201203745'
'2618': '201203928'
'2619': '201204330'
'2620': '201204389'
'2621': '201204555'
'2622': '201204813'
'2623': '201204970'
'2624': '201204971'
'2625': '201204975'
'2626': '201205043'
'2627': '201205150'
'2628': '201205250'
'2629': '201205252'
'2630': '201205253'
'2631': '201205255'
'2632': '201205261'
'2633': '201205262'
'2634': '201205263'
'2635': '201205266'
'2636': '201205414'
'2637': '201205415'
'2638': '201205623'
'2639': '201205628'
'2640': '201205653'
'2641': '201205654'
'2642': '201205825'
'2643': '201205826'
'2644': '201205827'
'2645': '201205828'
'2646': '201205829'
'2647': '201205833'
'2648': '201205930'
'2649': '201205939'
'2650': '201205942'
'2651': '201206020'
'2652': '201206056'
'2653': '201206160'
'2654': '201206161'
'2655': '201206162'
'2656': '201206163'
'2657': '201206166'
'2658': '201206176'
'2659': '201206201'
'2660': '201206205'
'2661': '201206250'
'2662': '201206297'
'2663': '201206327'
'2664': '201206328'
'2665': '201206338'
'2666': '201206403'
'2667': '201206410'
'2668': '201206435'
'2669': '201206489'
'2670': '201206493'
'2671': '201206494'
'2672': '201206499'
'2673': '201206540'
'2674': '201206552'
'2675': '201206553'
'2676': '201206575'
'2677': '201206612'
'2678': '201206613'
'2679': '201206667'
'2680': '201206685'
'2681': '201206704'
'2682': '201206741'
'2683': '201206751'
'2684': '201206760'
'2685': '201206761'
'2686': '201206776'
'2687': '201206804'
'2688': '201206805'
'2689': '201206831'
'2690': '201206844'
'2691': '201206846'
'2692': '201206856'
'2693': '201206857'
'2694': '201206858'
'2695': '201206870'
'2696': '201206871'
'2697': '201206872'
'2698': '201206877'
'2699': '201206904'
'2700': '201206905'
'2701': '201206916'
'2702': '201206934'
'2703': '201206935'
'2704': '201206936'
'2705': '201206953'
'2706': '201206954'
'2707': '201207002'
'2708': '201207003'
'2709': '201207004'
'2710': '201207009'
'2711': '201207014'
'2712': '201207015'
'2713': '201207061'
'2714': '201207067'
'2715': '201207105'
'2716': '201207217'
'2717': '201207236'
'2718': '201207292'
'2719': '201207294'
'2720': '201207300'
'2721': '201207301'
'2722': '201207402'
'2723': '201207403'
'2724': '201207404'
'2725': '201207405'
'2726': '201207417'
'2727': '201207453'
'2728': '201207459'
'2729': '201207461'
'2730': '201207469'
'2731': '201207486'
'2732': '201207488'
'2733': '201207513'
'2734': '201207516'
'2735': '201207517'
'2736': '201207537'
'2737': '201207538'
'2738': '201207539'
'2739': '201207540'
'2740': '201207541'
'2741': '201207575'
'2742': '201207577'
'2743': '201207586'
'2744': '201207589'
'2745': '201207592'
'2746': '201207593'
'2747': '201207595'
'2748': '201207596'
'2749': '201207597'
'2750': '201207688'
'2751': '201207721'
'2752': '201207722'
'2753': '201208078'
'2754': '201208080'
'2755': '201208081'
'2756': '201208110'
'2757': '201208193'
'2758': '201208256'
'2759': '201208462'
'2760': '201208467'
'2761': '201208480'
'2762': '201208482'
'2763': '201208483'
'2764': '201208484'
'2765': '201208518'
'2766': '201208629'
'2767': '201208630'
'2768': '201208632'
'2769': '201208643'
'2770': '201208653'
'2771': '201208654'
'2772': '201208655'
'2773': '201208723'
'2774': '201208740'
'2775': '201208753'
'2776': '201208754'
'2777': '201208855'
'2778': '201208875'
'2779': '201209444'
'2780': '201209757'
'2781': '201210163'
'2782': '201210185'
'2783': '201210187'
'2784': '201210235'
'2785': '201210240'
'2786': '201210744'
'2787': '201211055'
'2788': '201211103'
'2789': '201211564'
'2790': '201300114'
'2791': '201300128'
'2792': '201300139'
'2793': '201300148'
'2794': '201300152'
'2795': '201301236'
'2796': '201301369'
'2797': '201301377'
'2798': '201301498'
'2799': '201301586'
'2800': '201301591'
'2801': '201301621'
'2802': '201301656'
'2803': '201302165'
'2804': '201302169'
'2805': '201302260'
'2806': '201302390'
'2807': '201302618'
'2808': '201302619'
'2809': '201302802'
'2810': '201302936'
'2811': '201302948'
'2812': '201302980'
'2813': '201302982'
'2814': '201302990'
'2815': '201303029'
'2816': '201303030'
'2817': '201303035'
'2818': '201303045'
'2819': '201303269'
'2820': '201304289'
'2821': '201304386'
'2822': '201305008'
'2823': '201305030'
'2824': '201305371'
'2825': '201305403'
'2826': '201305533'
'2827': '201305813'
'2828': '201305827'
'2829': '201305832'
'2830': '201306036'
'2831': '201306176'
'2832': '201306203'
'2833': '201306230'
'2834': '201306262'
'2835': '201306264'
'2836': '201306277'
'2837': '201306289'
'2838': '201306315'
'2839': '201306327'
'2840': '201306328'
'2841': '201306329'
'2842': '201306340'
'2843': '201306344'
'2844': '201306352'
'2845': '201306362'
'2846': '201306363'
'2847': '201306366'
'2848': '201306367'
'2849': '201306368'
'2850': '201306369'
'2851': '201306370'
'2852': '201306376'
'2853': '201306377'
'2854': '201306379'
'2855': '201306383'
'2856': '201306385'
'2857': '201306386'
'2858': '201306404'
'2859': '201306407'
'2860': '201306412'
'2861': '201306419'
'2862': '201306425'
'2863': '201306452'
'2864': '201306453'
'2865': '201306454'
'2866': '201306455'
'2867': '201306549'
'2868': '201306824'
'2869': '201306917'
'2870': '201306922'
'2871': '201306942'
'2872': '201306954'
'2873': '201306969'
'2874': '201306990'
'2875': '201307038'
'2876': '201307042'
'2877': '201307062'
'2878': '201307079'
'2879': '201307092'
'2880': '201307108'
'2881': '201307118'
'2882': '201307120'
'2883': '201307121'
'2884': '201307123'
'2885': '201307124'
'2886': '201307131'
'2887': '201307132'
'2888': '201307149'
'2889': '201307158'
'2890': '201307209'
'2891': '201307243'
'2892': '201307261'
'2893': '201307290'
'2894': '201307342'
'2895': '201307357'
'2896': '201307377'
'2897': '201307379'
'2898': '201307416'
'2899': '201307449'
'2900': '201307476'
'2901': '201307484'
'2902': '201307514'
'2903': '201307515'
'2904': '201307539'
'2905': '201307558'
'2906': '201307559'
'2907': '201307562'
'2908': '201307567'
'2909': '201307570'
'2910': '201307607'
'2911': '201307655'
'2912': '201307682'
'2913': '201307687'
'2914': '201307725'
'2915': '201307726'
'2916': '201307767'
'2917': '201307807'
'2918': '201307881'
'2919': '201307882'
'2920': '201307888'
'2921': '201307896'
'2922': '201307904'
'2923': '201307925'
'2924': '201307926'
'2925': '201307927'
'2926': '201307936'
'2927': '201307937'
'2928': '201307947'
'2929': '201307988'
'2930': '201308005'
'2931': '201308053'
'2932': '201308062'
'2933': '201308067'
'2934': '201308099'
'2935': '201308195'
'2936': '201308197'
'2937': '201308209'
'2938': '201308223'
'2939': '201308236'
'2940': '201308242'
'2941': '201308278'
'2942': '201308287'
'2943': '201308288'
'2944': '201308289'
'2945': '201308295'
'2946': '201308303'
'2947': '201308368'
'2948': '201308371'
'2949': '201308398'
'2950': '201308399'
'2951': '201308416'
'2952': '201308434'
'2953': '201308448'
'2954': '201308450'
'2955': '201308451'
'2956': '201308457'
'2957': '201308458'
'2958': '201308459'
'2959': '201308460'
'2960': '201308461'
'2961': '201308462'
'2962': '201308463'
'2963': '201308464'
'2964': '201308466'
'2965': '201308467'
'2966': '201308468'
'2967': '201308469'
'2968': '201308470'
'2969': '201308476'
'2970': '201308478'
'2971': '201308479'
'2972': '201308480'
'2973': '201308481'
'2974': '201308483'
'2975': '201308485'
'2976': '201308488'
'2977': '201308489'
'2978': '201308490'
'2979': '201308494'
'2980': '201308495'
'2981': '201308496'
'2982': '201308500'
'2983': '201308502'
'2984': '201308504'
'2985': '201308506'
'2986': '201308509'
'2987': '201308524'
'2988': '201308525'
'2989': '201308526'
'2990': '201308537'
'2991': '201308538'
'2992': '201308539'
'2993': '201308540'
'2994': '201308541'
'2995': '201308550'
'2996': '201308552'
'2997': '201308553'
'2998': '201308564'
'2999': '201308566'
'3000': '201308567'
'3001': '201308573'
'3002': '201308574'
'3003': '201308589'
'3004': '201308613'
'3005': '201308626'
'3006': '201308651'
'3007': '201308652'
'3008': '201308671'
'3009': '201308712'
'3010': '201308718'
'3011': '201308824'
'3012': '201308859'
'3013': '201308895'
'3014': '201308906'
'3015': '201308923'
'3016': '201308924'
'3017': '201308926'
'3018': '201308928'
'3019': '201308943'
'3020': '201308977'
'3021': '201309019'
'3022': '201309044'
'3023': '201309067'
'3024': '201309068'
'3025': '201309070'
'3026': '201309071'
'3027': '201309088'
'3028': '201309092'
'3029': '201309126'
'3030': '201309133'
'3031': '201309139'
'3032': '201309145'
'3033': '201309146'
'3034': '201309164'
'3035': '201309176'
'3036': '201309180'
'3037': '201309184'
'3038': '201309266'
'3039': '201309269'
'3040': '201309336'
'3041': '201309360'
'3042': '201309384'
'3043': '201309416'
'3044': '201309417'
'3045': '201309418'
'3046': '201309425'
'3047': '201309444'
'3048': '201309449'
'3049': '201309460'
'3050': '201309480'
'3051': '201309481'
'3052': '201309484'
'3053': '201309488'
'3054': '201309494'
'3055': '201309539'
'3056': '201309566'
'3057': '201309576'
'3058': '201309599'
'3059': '201309682'
'3060': '201309684'
'3061': '201309738'
'3062': '201309741'
'3063': '201309743'
'3064': '201309744'
'3065': '201309809'
'3066': '201309814'
'3067': '201309848'
'3068': '201309850'
'3069': '201309883'
'3070': '201309888'
'3071': '201309895'
'3072': '201309896'
'3073': '201309897'
'3074': '201309926'
'3075': '201309959'
'3076': '201309963'
'3077': '201309964'
'3078': '201309973'
'3079': '201309994'
'3080': '201310000'
'3081': '201310013'
'3082': '201310206'
'3083': '201310293'
'3084': '201310300'
'3085': '201310318'
'3086': '201310334'
'3087': '201310348'
'3088': '201310352'
'3089': '201310361'
'3090': '201310362'
'3091': '201310380'
'3092': '201310381'
'3093': '201310382'
'3094': '201310387'
'3095': '201310393'
'3096': '201310399'
'3097': '201310402'
'3098': '201310409'
'3099': '201310587'
'3100': '201310588'
'3101': '201310589'
'3102': '201310590'
'3103': '201310619'
'3104': '201310620'
'3105': '201310720'
'3106': '201310742'
'3107': '201310756'
'3108': '201310764'
'3109': '201310777'
'3110': '201310780'
'3111': '201310781'
'3112': '201310782'
'3113': '201310783'
'3114': '201310785'
'3115': '201310797'
'3116': '201310798'
'3117': '201310820'
'3118': '201310826'
'3119': '201310830'
'3120': '201310841'
'3121': '201310842'
'3122': '201310869'
'3123': '201310895'
'3124': '201310896'
'3125': '201310897'
'3126': '201310931'
'3127': '201310935'
'3128': '201310964'
'3129': '201310977'
'3130': '201310978'
'3131': '201310991'
'3132': '201400024'
'3133': '201400053'
'3134': '201400099'
'3135': '201400118'
'3136': '201400120'
'3137': '201400193'
'3138': '201400200'
'3139': '201400262'
'3140': '201400286'
'3141': '201400288'
'3142': '201400289'
'3143': '201400290'
'3144': '201400293'
'3145': '201400337'
'3146': '201400352'
'3147': '201400366'
'3148': '201400367'
'3149': '201400372'
'3150': '201400381'
'3151': '201400382'
'3152': '201400385'
'3153': '201400386'
'3154': '201400387'
'3155': '201400392'
'3156': '201400393'
'3157': '201400409'
'3158': '201400410'
'3159': '201400422'
'3160': '201400426'
'3161': '201400429'
'3162': '201400433'
'3163': '201400434'
'3164': '201400435'
'3165': '201400438'
'3166': '201400451'
'3167': '201400452'
'3168': '201400493'
'3169': '201400497'
'3170': '201400537'
'3171': '201400538'
'3172': '201400681'
'3173': '201400706'
'3174': '201400757'
'3175': '201400771'
'3176': '201400785'
'3177': '201400804'
'3178': '201400828'
'3179': '201400834'
'3180': '201400842'
'3181': '201400873'
'3182': '201400918'
'3183': '201400931'
'3184': '201400932'
'3185': '201400933'
'3186': '201400980'
'3187': '201401034'
'3188': '201401057'
'3189': '201401098'
'3190': '201401129'
'3191': '201401226'
'3192': '201401227'
'3193': '201401235'
'3194': '201401334'
'3195': '201401343'
'3196': '201401344'
'3197': '201401372'
'3198': '201401376'
'3199': '201401417'
'3200': '201401456'
'3201': '201401475'
'3202': '201401487'
'3203': '201401489'
'3204': '201401490'
'3205': '201401518'
'3206': '201401569'
'3207': '201401591'
'3208': '201401625'
'3209': '201401626'
'3210': '201401730'
'3211': '201401871'
'3212': '201401904'
'3213': '201401921'
'3214': '201401923'
'3215': '201401937'
'3216': '201401965'
'3217': '201401966'
'3218': '201401984'
'3219': '201401985'
'3220': '201401986'
'3221': '201401996'
'3222': '201402039'
'3223': '201402056'
'3224': '201402065'
'3225': '201402070'
'3226': '201402071'
'3227': '201402074'
'3228': '201402137'
'3229': '201402148'
'3230': '201402159'
'3231': '201402185'
'3232': '201402206'
'3233': '201402245'
'3234': '201402272'
'3235': '201402276'
'3236': '201402309'
'3237': '201402414'
'3238': '201402416'
'3239': '201402427'
'3240': '201402435'
'3241': '201402436'
'3242': '201402450'
'3243': '201402463'
'3244': '201402464'
'3245': '201402560'
'3246': '201402579'
'3247': '201402641'
'3248': '201402642'
'3249': '201402652'
'3250': '201402682'
'3251': '201402707'
'3252': '201402711'
'3253': '201402714'
'3254': '201402733'
'3255': '201402734'
'3256': '201402804'
'3257': '201402806'
'3258': '201402822'
'3259': '201402828'
'3260': '201402829'
'3261': '201402850'
'3262': '201402856'
'3263': '201402864'
'3264': '201402865'
'3265': '201402866'
'3266': '201402868'
'3267': '201402869'
'3268': '201402875'
'3269': '201402880'
'3270': '201402890'
'3271': '201402909'
'3272': '201402912'
'3273': '201402913'
'3274': '201402936'
'3275': '201402997'
'3276': '201403012'
'3277': '201403023'
'3278': '201403026'
'3279': '201403029'
'3280': '201403031'
'3281': '201403051'
'3282': '201403070'
'3283': '201403088'
'3284': '201403118'
'3285': '201403126'
'3286': '201403127'
'3287': '201403146'
'3288': '201403148'
'3289': '201403152'
'3290': '201403184'
'3291': '201403185'
'3292': '201403217'
'3293': '201403243'
'3294': '201403253'
'3295': '201403298'
'3296': '201403332'
'3297': '201403337'
'3298': '201403380'
'3299': '201403382'
'3300': '201403388'
'3301': '201403389'
'3302': '201403391'
'3303': '201403398'
'3304': '201403407'
'3305': '201403408'
'3306': '201403417'
'3307': '201403429'
'3308': '201403455'
'3309': '201403456'
'3310': '201403457'
'3311': '201403497'
'3312': '201403501'
'3313': '201403518'
'3314': '201403519'
'3315': '201403533'
'3316': '201403537'
'3317': '201403543'
'3318': '201403550'
'3319': '201403559'
'3320': '201403560'
'3321': '201403567'
'3322': '201403570'
'3323': '201403571'
'3324': '201403572'
'3325': '201403597'
'3326': '201403598'
'3327': '201403617'
'3328': '201403627'
'3329': '201403650'
'3330': '201403692'
'3331': '201403693'
'3332': '201403694'
'3333': '201403697'
'3334': '201403698'
'3335': '201403726'
'3336': '201403732'
'3337': '201403734'
'3338': '201403739'
'3339': '201403797'
'3340': '201403816'
'3341': '201403819'
'3342': '201403820'
'3343': '201403826'
'3344': '201403833'
'3345': '201403870'
'3346': '201403877'
'3347': '201403880'
'3348': '201403884'
'3349': '201403886'
'3350': '201403889'
'3351': '201403891'
'3352': '201403892'
'3353': '201403894'
'3354': '201403919'
'3355': '201403930'
'3356': '201403939'
'3357': '201403943'
'3358': '201403944'
'3359': '201403945'
'3360': '201403947'
'3361': '201403952'
'3362': '201403955'
'3363': '201403957'
'3364': '201403967'
'3365': '201403975'
'3366': '201403983'
'3367': '201404005'
'3368': '201404006'
'3369': '201404008'
'3370': '201404009'
'3371': '201404010'
'3372': '201404013'
'3373': '201404014'
'3374': '201404015'
'3375': '201404016'
'3376': '201404026'
'3377': '201404035'
'3378': '201404036'
'3379': '201404038'
'3380': '201404042'
'3381': '201404043'
'3382': '201404088'
'3383': '201404093'
'3384': '201404114'
'3385': '201404115'
'3386': '201404116'
'3387': '201404136'
'3388': '201404162'
'3389': '201404163'
'3390': '201404168'
'3391': '201404170'
'3392': '201404198'
'3393': '201404200'
'3394': '201404206'
'3395': '201404215'
'3396': '201404239'
'3397': '201404271'
'3398': '201404287'
'3399': '201404293'
'3400': '201404299'
'3401': '201404316'
'3402': '201404327'
'3403': '201404340'
'3404': '201404342'
'3405': '201404343'
'3406': '201404377'
'3407': '201404400'
'3408': '201404403'
'3409': '201404406'
'3410': '201404415'
'3411': '201404417'
'3412': '201404424'
'3413': '201404425'
'3414': '201404426'
'3415': '201404441'
'3416': '201404454'
'3417': '201404456'
'3418': '201404458'
'3419': '201404483'
'3420': '201404486'
'3421': '201404487'
'3422': '201404488'
'3423': '201404497'
'3424': '201404513'
'3425': '201404527'
'3426': '201404528'
'3427': '201404537'
'3428': '201404538'
'3429': '201404539'
'3430': '201404541'
'3431': '201404566'
'3432': '201404569'
'3433': '201404590'
'3434': '201404591'
'3435': '201404592'
'3436': '201404593'
'3437': '201404595'
'3438': '201404596'
'3439': '201404597'
'3440': '201404633'
'3441': '201404634'
'3442': '201404644'
'3443': '201404654'
'3444': '201404664'
'3445': '201404677'
'3446': '201404678'
'3447': '201404689'
'3448': '201404690'
'3449': '201404696'
'3450': '201404714'
'3451': '201404715'
'3452': '201404716'
'3453': '201404731'
'3454': '201404732'
'3455': '201404733'
'3456': '201404741'
'3457': '201404786'
'3458': '201404800'
'3459': '201404801'
'3460': '201404824'
'3461': '201404836'
'3462': '201404857'
'3463': '201404863'
'3464': '201404889'
'3465': '201404928'
'3466': '201404929'
'3467': '201404930'
'3468': '201404976'
'3469': '201404981'
'3470': '201404983'
'3471': '201404986'
'3472': '201404993'
'3473': '201404994'
'3474': '201404995'
'3475': '201405000'
'3476': '201405003'
'3477': '201405010'
'3478': '201405014'
'3479': '201405026'
'3480': '201405027'
'3481': '201405033'
'3482': '201405042'
'3483': '201405043'
'3484': '201405044'
'3485': '201405045'
'3486': '201405046'
'3487': '201405050'
'3488': '201405051'
'3489': '201405052'
'3490': '201405063'
'3491': '201405073'
'3492': '201405078'
'3493': '201405079'
'3494': '201405080'
'3495': '201405094'
'3496': '201405096'
'3497': '201405100'
'3498': '201405103'
'3499': '201405104'
'3500': '201405107'
'3501': '201405111'
'3502': '201405112'
'3503': '201405113'
'3504': '201405114'
'3505': '201405130'
'3506': '201405131'
'3507': '201405132'
'3508': '201405175'
'3509': '201405177'
'3510': '201405232'
'3511': '201405233'
'3512': '201405262'
'3513': '201405282'
'3514': '201405290'
'3515': '201405347'
'3516': '201405387'
'3517': '201405424'
'3518': '201405425'
'3519': '201405434'
'3520': '201405436'
'3521': '201405439'
'3522': '201405449'
'3523': '201405450'
'3524': '201405454'
'3525': '201405464'
'3526': '201405467'
'3527': '201405468'
'3528': '201405469'
'3529': '201405482'
'3530': '201405502'
'3531': '201405503'
'3532': '201405504'
'3533': '201405519'
'3534': '201405520'
'3535': '201405531'
'3536': '201405585'
'3537': '201405586'
'3538': '201405598'
'3539': '201405610'
'3540': '201405615'
'3541': '201405693'
'3542': '201405708'
'3543': '201405813'
'3544': '201405839'
'3545': '201405840'
'3546': '201405841'
'3547': '201405844'
'3548': '201405857'
'3549': '201405865'
'3550': '201405882'
'3551': '201405921'
'3552': '201405930'
'3553': '201405956'
'3554': '201405971'
'3555': '201405995'
'3556': '201406030'
'3557': '201406032'
'3558': '201406033'
'3559': '201406046'
'3560': '201406102'
'3561': '201406125'
'3562': '201406126'
'3563': '201406140'
'3564': '201406156'
'3565': '201406161'
'3566': '201406170'
'3567': '201406171'
'3568': '201406179'
'3569': '201406315'
'3570': '201500025'
'3571': '201500081'
'3572': '201500082'
'3573': '201500083'
'3574': '201500099'
'3575': '201500101'
'3576': '201500107'
'3577': '201500114'
'3578': '201500127'
'3579': '201500145'
'3580': '201500149'
'3581': '201500168'
'3582': '201500169'
'3583': '201500171'
'3584': '201500179'
'3585': '201500233'
'3586': '201500293'
'3587': '201500313'
'3588': '201500364'
'3589': '201500365'
'3590': '201500366'
'3591': '201500367'
'3592': '201500372'
'3593': '201500387'
'3594': '201500401'
'3595': '201500429'
'3596': '201500450'
'3597': '201500458'
'3598': '201500483'
'3599': '201500509'
'3600': '201500516'
'3601': '201500517'
'3602': '201500521'
'3603': '201500522'
'3604': '201500530'
'3605': '201500554'
'3606': '201500645'
'3607': '201500650'
'3608': '201500694'
'3609': '201500722'
'3610': '201500725'
'3611': '201500728'
'3612': '201500731'
'3613': '201500852'
'3614': '201500856'
'3615': '201500902'
'3616': '201500903'
'3617': '201500929'
'3618': '201500948'
'3619': '201500959'
'3620': '201500963'
'3621': '201500964'
'3622': '201500965'
'3623': '201500966'
'3624': '201500967'
'3625': '201500969'
'3626': '201500972'
'3627': '201500982'
'3628': '201501074'
'3629': '201501083'
'3630': '201501128'
'3631': '201501207'
'3632': '201501216'
'3633': '201501270'
'3634': '201501274'
'3635': '201501307'
'3636': '201501322'
'3637': '201501325'
'3638': '201501350'
'3639': '201501441'
'3640': '201501446'
'3641': '201501476'
'3642': '201501488'
'3643': '201501494'
'3644': '201501500'
'3645': '201501501'
'3646': '201501503'
'3647': '201501539'
'3648': '201501540'
'3649': '201501548'
'3650': '201501550'
'3651': '201501555'
'3652': '201501556'
'3653': '201501557'
'3654': '201501569'
'3655': '201501592'
'3656': '201501595'
'3657': '201501605'
'3658': '201501608'
'3659': '201501609'
'3660': '201501610'
'3661': '201501611'
'3662': '201501612'
'3663': '201501622'
'3664': '201501624'
'3665': '201501644'
'3666': '201501647'
'3667': '201501657'
'3668': '201501658'
'3669': '201501663'
'3670': '201501675'
'3671': '201501676'
'3672': '201501677'
'3673': '201501680'
'3674': '201501687'
'3675': '201501689'
'3676': '201501690'
'3677': '201501693'
'3678': '201501698'
'3679': '201501702'
'3680': '201501703'
'3681': '201501714'
'3682': '201501723'
'3683': '201501724'
'3684': '201501725'
'3685': '201501730'
'3686': '201501733'
'3687': '201501741'
'3688': '201501744'
'3689': '201501778'
'3690': '201501783'
'3691': '201501790'
'3692': '201501791'
'3693': '201501811'
'3694': '201501814'
'3695': '201501853'
'3696': '201501854'
'3697': '201501865'
'3698': '201501874'
'3699': '201501885'
'3700': '201501891'
'3701': '201501925'
'3702': '201501928'
'3703': '201501929'
'3704': '201501931'
'3705': '201501944'
'3706': '201501965'
'3707': '201501978'
'3708': '201501979'
'3709': '201501980'
'3710': '201502217'
'3711': '201502267'
'3712': '201502291'
'3713': '201502292'
'3714': '201502307'
'3715': '201502310'
'3716': '201502330'
'3717': '201502345'
'3718': '201502363'
'3719': '201502371'
'3720': '201502379'
'3721': '201502394'
'3722': '201502397'
'3723': '201502417'
'3724': '201502419'
'3725': '201502420'
'3726': '201502421'
'3727': '201502450'
'3728': '201502453'
'3729': '201502454'
'3730': '201502471'
'3731': '201502552'
'3732': '201502560'
'3733': '201502562'
'3734': '201502564'
'3735': '201502570'
'3736': '201502606'
'3737': '201502607'
'3738': '201502608'
'3739': '201502609'
'3740': '201502631'
'3741': '201502638'
'3742': '201502639'
'3743': '201502640'
'3744': '201502644'
'3745': '201502645'
'3746': '201502654'
'3747': '201502656'
'3748': '201502658'
'3749': '201502659'
'3750': '201502670'
'3751': '201502671'
'3752': '201502677'
'3753': '201502678'
'3754': '201502689'
'3755': '201502690'
'3756': '201502703'
'3757': '201502704'
'3758': '201502705'
'3759': '201502708'
'3760': '201502709'
'3761': '201502710'
'3762': '201502712'
'3763': '201502714'
'3764': '201502717'
'3765': '201502718'
'3766': '201502727'
'3767': '201502742'
'3768': '201502761'
'3769': '201502766'
'3770': '201502770'
'3771': '201502773'
'3772': '201502774'
'3773': '201502775'
'3774': '201502785'
'3775': '201502786'
'3776': '201502790'
'3777': '201502793'
'3778': '201502813'
'3779': '201502862'
'3780': '201502874'
'3781': '201502890'
'3782': '201502917'
'3783': '201502924'
'3784': '201502926'
'3785': '201502952'
'3786': '201502969'
'3787': '201502979'
'3788': '201502986'
'3789': '201502987'
'3790': '201502994'
'3791': '201503006'
'3792': '201503025'
'3793': '201503026'
'3794': '201503027'
'3795': '201503029'
'3796': '201503051'
'3797': '201503142'
'3798': '201503144'
'3799': '201503191'
'3800': '201503193'
'3801': '201503212'
'3802': '201503218'
'3803': '201503219'
'3804': '201503234'
'3805': '201503249'
'3806': '201503251'
'3807': '201503253'
'3808': '201503288'
'3809': '201503289'
'3810': '201503290'
'3811': '201503293'
'3812': '201503295'
'3813': '201503302'
'3814': '201503335'
'3815': '201503336'
'3816': '201503468'
'3817': '201503475'
'3818': '201503482'
'3819': '201503488'
'3820': '201503490'
'3821': '201503506'
'3822': '201503516'
'3823': '201503520'
'3824': '201503532'
'3825': '201503554'
'3826': '201503575'
'3827': '201503600'
'3828': '201503641'
'3829': '201503645'
'3830': '201503647'
'3831': '201503683'
'3832': '201503771'
'3833': '201503772'
'3834': '201503783'
'3835': '201503839'
'3836': '201503865'
'3837': '201503876'
'3838': '201503920'
'3839': '201503921'
'3840': '201503948'
'3841': '201504046'
'3842': '201504047'
'3843': '201504070'
'3844': '201504073'
'3845': '201504084'
'3846': '201504106'
'3847': '201504115'
'3848': '201504119'
'3849': '201504124'
'3850': '201504135'
'3851': '201504137'
'3852': '201504179'
'3853': '201504199'
'3854': '201504202'
'3855': '201504203'
'3856': '201504205'
'3857': '201504225'
'3858': '201504231'
'3859': '201504249'
'3860': '201504255'
'3861': '201504259'
'3862': '201504303'
'3863': '201504313'
'3864': '201504314'
'3865': '201504315'
'3866': '201504317'
'3867': '201504318'
'3868': '201504319'
'3869': '201504320'
'3870': '201504321'
'3871': '201504326'
'3872': '201504332'
'3873': '201504333'
'3874': '201504334'
'3875': '201504493'
'3876': '201504494'
'3877': '201504495'
'3878': '201504519'
'3879': '201504729'
'3880': '201504732'
'3881': '201504763'
'3882': '201504788'
'3883': '201504988'
'3884': '201505072'
'3885': '201505074'
'3886': '201505091'
'3887': '201505114'
'3888': '201505137'
'3889': '201505162'
'3890': '201505163'
'3891': '201505164'
'3892': '201505167'
'3893': '201505168'
'3894': '201505169'
'3895': '201505192'
'3896': '201505198'
'3897': '201505199'
'3898': '201505259'
'3899': '201505342'
'3900': '201505350'
'3901': '201505365'
'3902': '201505441'
'3903': '201505522'
'3904': '201505607'
'3905': '201505621'
'3906': '201505725'
'3907': '201505737'
'3908': '201505796'
'3909': '201505797'
'3910': '201505830'
'3911': '201505842'
'3912': '201505843'
'3913': '201505901'
'3914': '201505903'
'3915': '201505904'
'3916': '201505920'
'3917': '201505921'
'3918': '201505935'
'3919': '201506120'
'3920': '201506121'
'3921': '201506122'
'3922': '201506136'
'3923': '201506143'
'3924': '201506288'
'3925': '201506398'
'3926': '201506402'
'3927': '201506405'
'3928': '201506481'
'3929': '201506489'
'3930': '201506507'
'3931': '201506569'
'3932': '201506583'
'3933': '201506589'
'3934': '201506623'
'3935': '201506624'
'3936': '201506639'
'3937': '201506649'
'3938': '201506677'
'3939': '201506684'
'3940': '201506685'
'3941': '201506686'
'3942': '201506729'
'3943': '201506734'
'3944': '201506748'
'3945': '201506766'
'3946': '201506769'
'3947': '201506783'
'3948': '201506785'
'3949': '201506786'
'3950': '201506795'
'3951': '201506797'
'3952': '201506799'
'3953': '201506815'
'3954': '201506816'
'3955': '201506835'
'3956': '201506843'
'3957': '201506848'
'3958': '201506851'
'3959': '201506856'
'3960': '201506857'
'3961': '201506873'
'3962': '201506874'
'3963': '201506894'
'3964': '201507014'
'3965': '201507045'
'3966': '201507071'
'3967': '201507072'
'3968': '201507074'
'3969': '201507075'
'3970': '201507134'
'3971': '201507139'
'3972': '201507146'
'3973': '201507195'
'3974': '201507196'
'3975': '201507261'
'3976': '201507270'
'3977': '201507272'
'3978': '201507273'
'3979': '201507288'
'3980': '201507291'
'3981': '201507380'
'3982': '201507381'
'3983': '201507390'
'3984': '201507399'
'3985': '201507405'
'3986': '201507411'
'3987': '201507412'
'3988': '201507413'
'3989': '201507418'
'3990': '201507420'
'3991': '201507424'
'3992': '201507426'
'3993': '201507452'
'3994': '201507455'
'3995': '201507458'
'3996': '201507464'
'3997': '201507465'
'3998': '201507470'
'3999': '201507477'
'4000': '201507478'
'4001': '201507501'
'4002': '201507502'
'4003': '201507517'
'4004': '201507521'
'4005': '201507527'
'4006': '201507582'
'4007': '201507644'
'4008': '201507645'
'4009': '201507681'
'4010': '201507682'
'4011': '201507715'
'4012': '201507736'
'4013': '201507737'
'4014': '201507738'
'4015': '201507762'
'4016': '201507763'
'4017': '201507764'
'4018': '201507776'
'4019': '201507778'
'4020': '201507785'
'4021': '201507786'
'4022': '201507806'
'4023': '201507839'
'4024': '201507840'
'4025': '201507875'
'4026': '201507876'
'4027': '201507887'
'4028': '201507888'
'4029': '201507891'
'4030': '201507895'
'4031': '201507896'
'4032': '201507897'
'4033': '201507898'
'4034': '201507899'
'4035': '201507944'
'4036': '201507948'
'4037': '201507964'
'4038': '201507965'
'4039': '201507966'
'4040': '201507967'
'4041': '201507968'
'4042': '201507984'
'4043': '201507985'
'4044': '201507986'
'4045': '201507997'
'4046': '201507998'
'4047': '201507999'
'4048': '201508003'
'4049': '201508004'
'4050': '201508018'
'4051': '201508019'
'4052': '201508116'
'4053': '201508117'
'4054': '201508139'
'4055': '201508213'
'4056': '201508214'
'4057': '201508231'
'4058': '201508232'
'4059': '201508272'
'4060': '201508281'
'4061': '201508325'
'4062': '201508372'
'4063': '201508378'
'4064': '201508403'
'4065': '201508407'
'4066': '201508411'
'4067': '201508428'
'4068': '201508435'
'4069': '201508436'
'4070': '201508440'
'4071': '201508441'
'4072': '201508453'
'4073': '201508461'
'4074': '201508476'
'4075': '201508477'
'4076': '201508486'
'4077': '201508519'
'4078': '201508531'
'4079': '201508532'
'4080': '201508542'
'4081': '201600113'
'4082': '201600170'
'4083': '201600171'
'4084': '201600172'
'4085': '201600191'
'4086': '201600217'
'4087': '201600251'
'4088': '201600274'
'4089': '201600275'
'4090': '201600300'
'4091': '201600301'
'4092': '201600339'
'4093': '201600340'
'4094': '201600356'
'4095': '201600358'
'4096': '201600359'
'4097': '201600361'
'4098': '201600391'
'4099': '201600396'
'4100': '201600407'
'4101': '201600425'
'4102': '201600426'
'4103': '201600436'
'4104': '201600445'
'4105': '201600498'
'4106': '201600500'
'4107': '201600505'
'4108': '201600507'
'4109': '201600515'
'4110': '201600516'
'4111': '201600519'
'4112': '201600568'
'4113': '201600569'
'4114': '201600572'
'4115': '201600573'
'4116': '201600574'
'4117': '201600584'
'4118': '201600585'
'4119': '201600601'
'4120': '201600695'
'4121': '201600696'
'4122': '201600731'
'4123': '201600777'
'4124': '201600782'
'4125': '201600828'
'4126': '201600839'
'4127': '201600853'
'4128': '201600903'
'4129': '201600933'
'4130': '201600938'
'4131': '201600941'
'4132': '201600974'
'4133': '201600978'
'4134': '201600986'
'4135': '201601004'
'4136': '201601006'
'4137': '201601009'
'4138': '201601030'
'4139': '201601045'
'4140': '201601046'
'4141': '201601047'
'4142': '201601050'
'4143': '201601053'
'4144': '201601071'
'4145': '201601081'
'4146': '201601082'
'4147': '201601112'
'4148': '201601113'
'4149': '201601117'
'4150': '201601119'
'4151': '201601149'
'4152': '201601191'
'4153': '201601195'
'4154': '201601196'
'4155': '201601197'
'4156': '201601201'
'4157': '201601202'
'4158': '201601203'
'4159': '201601219'
'4160': '201601220'
'4161': '201601221'
'4162': '201601231'
'4163': '201601232'
'4164': '201601233'
'4165': '201601234'
'4166': '201601249'
'4167': '201601347'
'4168': '201601506'
'4169': '201601522'
'4170': '201601537'
'4171': '201601552'
'4172': '201601553'
'4173': '201601554'
'4174': '201601555'
'4175': '201601643'
'4176': '201601680'
'4177': '201601683'
'4178': '201601700'
'4179': '201601702'
'4180': '201601703'
'4181': '201601721'
'4182': '201601725'
'4183': '201601728'
'4184': '201601731'
'4185': '201601734'
'4186': '201601737'
'4187': '201601742'
'4188': '201601743'
'4189': '201601761'
'4190': '201601762'
'4191': '201601896'
'4192': '201602068'
'4193': '201602080'
'4194': '201602081'
'4195': '201602085'
'4196': '201602122'
'4197': '201602235'
'4198': '201602239'
'4199': '201602274'
'4200': '201602286'
'4201': '201602294'
'4202': '201602295'
'4203': '201602296'
'4204': '201602307'
'4205': '201602314'
'4206': '201602328'
'4207': '201602339'
'4208': '201602349'
'4209': '201602351'
'4210': '201602362'
'4211': '201602383'
'4212': '201602409'
'4213': '201602421'
'4214': '201602427'
'4215': '201602432'
'4216': '201602491'
'4217': '201602500'
'4218': '201602513'
'4219': '201602518'
'4220': '201602533'
'4221': '201602561'
'4222': '201602610'
'4223': '201602619'
'4224': '201602662'
'4225': '201602702'
'4226': '201602708'
'4227': '201602739'
'4228': '201602766'
'4229': '201602784'
'4230': '201602785'
'4231': '201602888'
'4232': '201602889'
'4233': '201602891'
'4234': '201602908'
'4235': '201602909'
'4236': '201602926'
'4237': '201602930'
'4238': '201602957'
'4239': '201602959'
'4240': '201602965'
'4241': '201602983'
'4242': '201602996'
'4243': '201603025'
'4244': '201603032'
'4245': '201603033'
'4246': '201603057'
'4247': '201603058'
'4248': '201603059'
'4249': '201603073'
'4250': '201603074'
'4251': '201603076'
'4252': '201603084'
'4253': '201603097'
'4254': '201603104'
'4255': '201603127'
'4256': '201603130'
'4257': '201603137'
'4258': '201603147'
'4259': '201603150'
'4260': '201603151'
'4261': '201603152'
'4262': '201603161'
'4263': '201603173'
'4264': '201603178'
'4265': '201603226'
'4266': '201603227'
'4267': '201603228'
'4268': '201603229'
'4269': '201603298'
'4270': '201603314'
'4271': '201603315'
'4272': '201603316'
'4273': '201603317'
'4274': '201603320'
'4275': '201603340'
'4276': '201603363'
'4277': '201603369'
'4278': '201603370'
'4279': '201603416'
'4280': '201603432'
'4281': '201603454'
'4282': '201603455'
'4283': '201603462'
'4284': '201603468'
'4285': '201603493'
'4286': '201603497'
'4287': '201603514'
'4288': '201603530'
'4289': '201603552'
'4290': '201603608'
'4291': '201603615'
'4292': '201603628'
'4293': '201603633'
'4294': '201603644'
'4295': '201603663'
'4296': '201603669'
'4297': '201603670'
'4298': '201603681'
'4299': '201603682'
'4300': '201603683'
'4301': '201603692'
'4302': '201603728'
'4303': '201603729'
'4304': '201603775'
'4305': '201603787'
'4306': '201603794'
'4307': '201603795'
'4308': '201603796'
'4309': '201603807'
'4310': '201603883'
'4311': '201603890'
'4312': '201603892'
'4313': '201603894'
'4314': '201603895'
'4315': '201603899'
'4316': '201603919'
'4317': '201603920'
'4318': '201603929'
'4319': '201603931'
'4320': '201603988'
'4321': '201603989'
'4322': '201603995'
'4323': '201604010'
'4324': '201604018'
'4325': '201604034'
'4326': '201604054'
'4327': '201604060'
'4328': '201604075'
'4329': '201604076'
'4330': '201604117'
'4331': '201604118'
'4332': '201604119'
'4333': '201604147'
'4334': '201604148'
'4335': '201604152'
'4336': '201604178'
'4337': '201604202'
'4338': '201604209'
'4339': '201604247'
'4340': '201604290'
'4341': '201604348'
'4342': '201604367'
'4343': '201604375'
'4344': '201604409'
'4345': '201604410'
'4346': '201604411'
'4347': '201604429'
'4348': '201604461'
'4349': '201604485'
'4350': '201604621'
'4351': '201604622'
'4352': '201604623'
'4353': '201604628'
'4354': '201604631'
'4355': '201604632'
'4356': '201604669'
'4357': '201604677'
'4358': '201604678'
'4359': '201604687'
'4360': '201604696'
'4361': '201604705'
'4362': '201604823'
'4363': '201604942'
'4364': '201605007'
'4365': '201605011'
'4366': '201605135'
'4367': '201605175'
'4368': '201605323'
'4369': '201605325'
'4370': '201605340'
'4371': '201605497'
'4372': '201605721'
'4373': '201605722'
'4374': '201605723'
'4375': '201605749'
'4376': '201605790'
'4377': '201605791'
'4378': '201605792'
'4379': '201605801'
'4380': '201605802'
'4381': '201605803'
'4382': '201605804'
'4383': '201605805'
'4384': '201605806'
'4385': '201605819'
'4386': '201605820'
'4387': '201605821'
'4388': '201605845'
'4389': '201605851'
'4390': '201605853'
'4391': '201605860'
'4392': '201605978'
'4393': '201606046'
'4394': '201606047'
'4395': '201606048'
'4396': '201606049'
'4397': '201606050'
'4398': '201606051'
'4399': '201606053'
'4400': '201606081'
'4401': '201606123'
'4402': '201606189'
'4403': '201606254'
'4404': '201606256'
'4405': '201606264'
'4406': '201606306'
'4407': '201606307'
'4408': '201606380'
'4409': '201606381'
'4410': '201606391'
'4411': '201606394'
'4412': '201606395'
'4413': '201606401'
'4414': '201606419'
'4415': '201606563'
'4416': '201606564'
'4417': '201606593'
'4418': '201606695'
'4419': '201606770'
'4420': '201606772'
'4421': '201606795'
'4422': '201606799'
'4423': '201606818'
'4424': '201606829'
'4425': '201606935'
'4426': '201607019'
'4427': '201607041'
'4428': '201607042'
'4429': '201607043'
'4430': '201607093'
'4431': '201607205'
'4432': '201607206'
'4433': '201607211'
'4434': '201607298'
'4435': '201607299'
'4436': '201700144'
'4437': '201700156'
'4438': '201700243'
'4439': '201700244'
'4440': '201700254'
'4441': '201700377'
'4442': '201700390'
'4443': '201700402'
'4444': '201700404'
'4445': '201700418'
'4446': '201700452'
'4447': '201700463'
'4448': '201700518'
'4449': '201700619'
'4450': '201700620'
'4451': '201700621'
'4452': '201700623'
'4453': '201700684'
'4454': '201700730'
'4455': '201700836'
'4456': '201700897'
'4457': '201700900'
'4458': '201700904'
'4459': '201700906'
'4460': '201700921'
'4461': '201701040'
'4462': '201701044'
'4463': '201701157'
'4464': '201701163'
'4465': '201701169'
'4466': '201701170'
'4467': '201701194'
'4468': '201701196'
'4469': '201701198'
'4470': '201701233'
'4471': '201701288'
'4472': '201701289'
'4473': '201701296'
'4474': '201701303'
'4475': '201701339'
'4476': '201701340'
'4477': '201701380'
'4478': '201701414'
'4479': '201701415'
'4480': '201701435'
'4481': '201701447'
'4482': '201701456'
'4483': '201701458'
'4484': '201701471'
'4485': '201701478'
'4486': '201701479'
'4487': '201701501'
'4488': '201701538'
'4489': '201701544'
'4490': '201701545'
'4491': '201701546'
'4492': '201701551'
'4493': '201701574'
'4494': '201701611'
'4495': '201701717'
'4496': '201701749'
'4497': '201701750'
'4498': '201701865'
'4499': '201701878'
'4500': '201701879'
'4501': '201701882'
'4502': '201701960'
'4503': '201702166'
'4504': '201702167'
'4505': '201702168'
'4506': '201702204'
'4507': '201702236'
'4508': '201702237'
'4509': '201702238'
'4510': '201702244'
'4511': '201702246'
'4512': '201702256'
'4513': '201702259'
'4514': '201702281'
'4515': '201702386'
'4516': '201702387'
'4517': '201702390'
'4518': '201702397'
'4519': '201702398'
'4520': '201702399'
'4521': '201702401'
'4522': '201702402'
'4523': '201702403'
'4524': '201702404'
'4525': '201702485'
'4526': '201702492'
'4527': '201702493'
'4528': '201702508'
'4529': '201702526'
'4530': '201702527'
'4531': '201702528'
'4532': '201702541'
'4533': '201702542'
'4534': '201702577'
'4535': '201702610'
'4536': '201702624'
'4537': '201702688'
'4538': '201702735'
'4539': '201702768'
'4540': '201702781'
'4541': '201702782'
'4542': '201702783'
'4543': '201702795'
'4544': '201703991'
'4545': '201703995'
'4546': '201704017'
'4547': '201704542'
'4548': '201704726'
'4549': '201704753'
'4550': '201704761'
'4551': '201704762'
'4552': '201704782'
'4553': '201704786'
'4554': '201704833'
'4555': '201704879'
'4556': '201705029'
'4557': '201705044'
'4558': '201705174'
'4559': '201705175'
'4560': '201705206'
'4561': '201705217'
'4562': '201705219'
'4563': '201705220'
'4564': '201705221'
'4565': '201705254'
'4566': '201705255'
'4567': '201705467'
'4568': '201705468'
'4569': '201705487'
'4570': '201705488'
'4571': '201705491'
'4572': '201705673'
'4573': '201705684'
'4574': '201705685'
'4575': '201705686'
'4576': '201705687'
'4577': '201705688'
'4578': '201705689'
'4579': '201705789'
'4580': '201705790'
'4581': '201705809'
'4582': '201705811'
'4583': '201706137'
'4584': '201706144'
'4585': '201706188'
'4586': '201706209'
'4587': '201706210'
'4588': '201706211'
'4589': '201706212'
'4590': '201706221'
'4591': '201706237'
'4592': '201706344'
'4593': '201706345'
'4594': '201706353'
'4595': '201706355'
'4596': '201706473'
'4597': '201706474'
'4598': '201706555'
'4599': '201706556'
'4600': '201706557'
'4601': '201706560'
'4602': '201706564'
'4603': '201706565'
'4604': '201706571'
'4605': '201706572'
'4606': '201706573'
'4607': '201706574'
'4608': '201706575'
'4609': '201706583'
'4610': '201706584'
'4611': '201706641'
'4612': '201706642'
'4613': '201706690'
'4614': '201706734'
'4615': '201706749'
'4616': '201706750'
'4617': '201706753'
'4618': '201706786'
'4619': '201706787'
'4620': '201706788'
'4621': '201706789'
'4622': '201706790'
'4623': '201706791'
'4624': '201706798'
'4625': '201706814'
'4626': '201706821'
'4627': '201706825'
'4628': '201706845'
'4629': '201706853'
'4630': '201706858'
'4631': '201706866'
'4632': '201706889'
'4633': '201706890'
'4634': '201706899'
'4635': '201706900'
'4636': '201706969'
'4637': '201707021'
'4638': '201707062'
'4639': '201707098'
'4640': '201707167'
'4641': '201707212'
'4642': '201707216'
'4643': '201707217'
'4644': '201707234'
'4645': '201707244'
'4646': '201707245'
'4647': '201707248'
'4648': '201707295'
'4649': '201707296'
'4650': '201707302'
'4651': '201707311'
'4652': '201707327'
'4653': '201707350'
'4654': '201707352'
'4655': '201707392'
'4656': '201707443'
'4657': '201707526'
'4658': '201707534'
'4659': '201707535'
'4660': '201707539'
'4661': '201707601'
'4662': '201707616'
'4663': '201707639'
'4664': '201707644'
'4665': '201707649'
'4666': '201707653'
'4667': '201707658'
'4668': '201707659'
'4669': '201707660'
'4670': '201707669'
'4671': '201707674'
'4672': '201707675'
'4673': '201707837'
'4674': '201707838'
'4675': '201707853'
'4676': '201707860'
'4677': '201707937'
'4678': '201707938'
'4679': '201707939'
'4680': '201707944'
'4681': '201708018'
'4682': '201708021'
'4683': '201708034'
'4684': '201708043'
'4685': '201708044'
'4686': '201708045'
'4687': '201708055'
'4688': '201708069'
'4689': '201708080'
'4690': '201708083'
'4691': '201708085'
'4692': '201708087'
'4693': '201708091'
'4694': '201708095'
'4695': '201708096'
'4696': '201708110'
'4697': '201708131'
'4698': '201708142'
'4699': '201708144'
'4700': '201708159'
'4701': '201708184'
'4702': '201708188'
'4703': '201708197'
'4704': '201708198'
'4705': '201708199'
'4706': '201708204'
'4707': '201708219'
'4708': '201708227'
'4709': '201708233'
'4710': '201708277'
'4711': '201708280'
'4712': '201708281'
'4713': '201708331'
'4714': '201708341'
'4715': '201708362'
'4716': '201708367'
'4717': '201708368'
'4718': '201708370'
'4719': '201708380'
'4720': '201708416'
'4721': '201708419'
'4722': '201708420'
'4723': '201708422'
'4724': '201708434'
'4725': '201708449'
'4726': '201708450'
'4727': '201708456'
'4728': '201708459'
'4729': '201708465'
'4730': '201708466'
'4731': '201708476'
'4732': '201708478'
'4733': '201708482'
'4734': '201708490'
'4735': '201708492'
'4736': '201708494'
'4737': '201708498'
'4738': '201708499'
'4739': '201708532'
'4740': '201708533'
'4741': '201708542'
'4742': '201708549'
'4743': '201708551'
'4744': '201708555'
'4745': '201708559'
'4746': '201708569'
'4747': '201708570'
'4748': '201708571'
'4749': '201708577'
'4750': '201708578'
'4751': '201708579'
'4752': '201708581'
'4753': '201708582'
'4754': '201708583'
'4755': '201708591'
'4756': '201708594'
'4757': '201708597'
'4758': '201708604'
'4759': '201708606'
'4760': '201708609'
'4761': '201708615'
'4762': '201708620'
'4763': '201708621'
'4764': '201708622'
'4765': '201708628'
'4766': '201708633'
'4767': '201708634'
'4768': '201708635'
'4769': '201708636'
'4770': '201708642'
'4771': '201800016'
'4772': '201800048'
'4773': '201800067'
'4774': '201800087'
'4775': '201800100'
'4776': '201800123'
'4777': '201800128'
'4778': '201800133'
'4779': '201800142'
'4780': '201800184'
'4781': '201800191'
'4782': '201800217'
'4783': '201800228'
'4784': '201800229'
'4785': '201800231'
'4786': '201800239'
'4787': '201800286'
'4788': '201800298'
'4789': '201800299'
'4790': '201800300'
'4791': '201800302'
'4792': '201800304'
'4793': '201800305'
'4794': '201800306'
'4795': '201800307'
'4796': '201800308'
'4797': '201800313'
'4798': '201800314'
'4799': '201800317'
'4800': '201800318'
'4801': '201800319'
'4802': '201800320'
'4803': '201800344'
'4804': '201800496'
'4805': '201800499'
'4806': '201800500'
'4807': '201800503'
'4808': '201800504'
'4809': '201800506'
'4810': '201800507'
'4811': '201800511'
'4812': '201800512'
'4813': '201800513'
'4814': '201800530'
'4815': '201800531'
'4816': '201800546'
'4817': '201800567'
'4818': '201800568'
'4819': '201800576'
'4820': '201800600'
'4821': '201800601'
'4822': '201800609'
'4823': '201800638'
'4824': '201800639'
'4825': '201800640'
'4826': '201800648'
'4827': '201800668'
'4828': '201800675'
'4829': '201800701'
'4830': '201800718'
'4831': '201800764'
'4832': '201800767'
'4833': '201800768'
'4834': '201800791'
'4835': '201800807'
'4836': '201800810'
'4837': '201800811'
'4838': '201800813'
'4839': '201800836'
'4840': '201800860'
'4841': '201800888'
'4842': '201800889'
'4843': '201800907'
'4844': '201800910'
'4845': '201800924'
'4846': '201800930'
'4847': '201800935'
'4848': '201800936'
'4849': '201800937'
'4850': '201800950'
'4851': '201800964'
'4852': '201800992'
'4853': '201801001'
'4854': '201801002'
'4855': '201801008'
'4856': '201801071'
'4857': '201801073'
'4858': '201801087'
'4859': '201801145'
'4860': '201801152'
'4861': '201801153'
'4862': '201801175'
'4863': '201801179'
'4864': '201801224'
'4865': '201801267'
'4866': '201801271'
'4867': '201801291'
'4868': '201801346'
'4869': '201801353'
'4870': '201801354'
'4871': '201801386'
'4872': '201801388'
'4873': '201801415'
'4874': '201801418'
'4875': '201801429'
'4876': '201801430'
'4877': '201801431'
'4878': '201801432'
'4879': '201801433'
'4880': '201801516'
'4881': '201801580'
'4882': '201801617'
'4883': '201801680'
'4884': '201801687'
'4885': '201801715'
'4886': '201801725'
'4887': '201801763'
'4888': '201801764'
'4889': '201801765'
'4890': '201801963'
'4891': '201802050'
'4892': '201802103'
'4893': '201802129'
'4894': '201802146'
'4895': '201802226'
'4896': '201802227'
'4897': '201802228'
'4898': '201802255'
'4899': '201802335'
'4900': '201802339'
'4901': '201802378'
'4902': '201802463'
'4903': '201802579'
'4904': '201802586'
'4905': '201802641'
'4906': '201802652'
'4907': '201802697'
'4908': '201802700'
'4909': '201802737'
'4910': '201802751'
'4911': '201802763'
'4912': '201802764'
'4913': '201802818'
'4914': '201802894'
'4915': '201802909'
'4916': '201803000'
'4917': '201803017'
'4918': '201803041'
'4919': '201803060'
'4920': '201803093'
'4921': '201803122'
'4922': '201803136'
'4923': '201803164'
'4924': '201803165'
'4925': '201803166'
'4926': '201803189'
'4927': '201803233'
'4928': '201803298'
'4929': '201803304'
'4930': '201803367'
'4931': '201803379'
'4932': '201803380'
'4933': '201803500'
'4934': '201803502'
'4935': '201803516'
'4936': '201803529'
'4937': '201803530'
'4938': '201803606'
'4939': '201803649'
'4940': '201803666'
'4941': '201803667'
'4942': '201803677'
'4943': '201803699'
'4944': '201803704'
'4945': '201803731'
'4946': '201803777'
'4947': '201803799'
'4948': '201803815'
'4949': '201803824'
'4950': '201803844'
'4951': '201803853'
'4952': '201803880'
'4953': '201803881'
'4954': '201803887'
'4955': '201803888'
'4956': '201803889'
'4957': '201803896'
'4958': '201803899'
'4959': '201804023'
'4960': '201804024'
'4961': '201804149'
'4962': '201804162'
'4963': '201804237'
'4964': '201804271'
'4965': '201804306'
'4966': '201804307'
'4967': '201804308'
'4968': '201804313'
'4969': '201804352'
'4970': '201804441'
'4971': '201804442'
'4972': '201804443'
'4973': '201804479'
'4974': '201804518'
'4975': '201804572'
'4976': '201804690'
'4977': '201804758'
'4978': '201804760'
'4979': '201804826'
'4980': '201804896'
'4981': '201804988'
'4982': '201805048'
'4983': '201805182'
'4984': '201805183'
'4985': '201805293'
'4986': '201805294'
'4987': '201805295'
'4988': '201805337'
'4989': '201805338'
'4990': '201805340'
'4991': '201805341'
'4992': '201805342'
'4993': '201805351'
'4994': '201900011'
'4995': '201900112'
'4996': '201900142'
'4997': '201900143'
'4998': '201900146'
'4999': '201900219'
'5000': '201900220'
'5001': '201900226'
'5002': '201900267'
'5003': '201900303'
'5004': '201900304'
'5005': '201900321'
'5006': '201900595'
'5007': '201900606'
'5008': '201900630'
'5009': '201900645'
'5010': '201900672'
'5011': '201900673'
'5012': '201900688'
'5013': '201900689'
'5014': '201900713'
'5015': '201900734'
'5016': '201900756'
'5017': '201900766'
'5018': '201900875'
'5019': '201900973'
'5020': '201901014'
'5021': '201901301'
'5022': '201901302'
'5023': '201901457'
'5024': '201901642'
'5025': '201901787'
'5026': '201901858'
'5027': '201901859'
'5028': '201901918'
'5029': '201901919'
'5030': '201902042'
'5031': '201902055'
'5032': '201902071'
'5033': '201902113'
'5034': '201902114'
'5035': '201902200'
'5036': '201902470'
'5037': '201902565'
'5038': '201902754'
'5039': '201902773'
'5040': '201902819'
'5041': '201903304'
'5042': '201903597'
'5043': '201903815'
'5044': '201904168'
'5045': '201904204'
'5046': '201904276'
'5047': '201904421'
'5048': '201904442'
'5049': '201904444'
'5050': '201904825'
'5051': '201905085'
'5052': '201905306'
'5053': '201905460'
'5054': '201905619'
'5055': '201905927'
'5056': '201906789'
'5057': '201906790'
'5058': '201906791'
'5059': '201906797'
'5060': '201906798'
'5061': '201906879'
'5062': '201906921'
'5063': '201906965'
'5064': '201906970'
'5065': '201906992'
'5066': '201907134'
'5067': '201907188'
'5068': '201907198'
'5069': '201907289'
'5070': '201907290'
'5071': '201907386'
'5072': '201907387'
'5073': '201907465'
'5074': '201907466'
'5075': '201907531'
'5076': '201907532'
'5077': '201907533'
'5078': '201907576'
'5079': '201907581'
'5080': '201907607'
'5081': '201907624'
'5082': '201907625'
'5083': '201907803'
'5084': '201908003'
'5085': '201908132'
'5086': '201908146'
'5087': '201908339'
- name: image_id
dtype: int64
- name: width
dtype: int64
- name: height
dtype: int64
- name: objects
struct:
- name: area
sequence: int64
- name: bbox
sequence:
sequence: int64
- name: category
sequence: int64
- name: id
sequence: int64
splits:
- name: train
num_bytes: 9633200597.0
num_examples: 20620
download_size: 8789960710
dataset_size: 9633200597.0
- config_name: origin_image
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '196000001'
'1': '196200043'
'2': '196300001'
'3': '196400099'
'4': '196500004'
'5': '197000037'
'6': '197000049'
'7': '197000050'
'8': '197000053'
'9': '197000079'
'10': '197000102'
'11': '197100097'
'12': '197300021'
'13': '197400039'
'14': '197400059'
'15': '197500015'
'16': '197500016'
'17': '197500285'
'18': '197500541'
'19': '197500654'
'20': '197600065'
'21': '197700025'
'22': '197700049'
'23': '197700120'
'24': '197800027'
'25': '197900544'
'26': '197900575'
'27': '198000054'
'28': '198000158'
'29': '198000160'
'30': '198000170'
'31': '198000572'
'32': '198100012'
'33': '198100015'
'34': '198100119'
'35': '198100257'
'36': '198100428'
'37': '198200048'
'38': '198200049'
'39': '198200323'
'40': '198200325'
'41': '198300064'
'42': '198300065'
'43': '198300096'
'44': '198300142'
'45': '198300174'
'46': '198300343'
'47': '198300393'
'48': '198300476'
'49': '198300605'
'50': '198301052'
'51': '198400185'
'52': '198400399'
'53': '198400475'
'54': '198401033'
'55': '198401161'
'56': '198500041'
'57': '198500049'
'58': '198500050'
'59': '198500125'
'60': '198500241'
'61': '198500245'
'62': '198500384'
'63': '198500515'
'64': '198500612'
'65': '198500634'
'66': '198500715'
'67': '198500718'
'68': '198501126'
'69': '198501220'
'70': '198501325'
'71': '198501456'
'72': '198501820'
'73': '198600058'
'74': '198600114'
'75': '198600161'
'76': '198600235'
'77': '198600470'
'78': '198600661'
'79': '198600674'
'80': '198601102'
'81': '198601223'
'82': '198601878'
'83': '198700096'
'84': '198700405'
'85': '198700476'
'86': '198700477'
'87': '198700535'
'88': '198700537'
'89': '198700667'
'90': '198700731'
'91': '198700733'
'92': '198700736'
'93': '198700737'
'94': '198701132'
'95': '198701478'
'96': '198701520'
'97': '198701523'
'98': '198701583'
'99': '198800150'
'100': '198800153'
'101': '198800154'
'102': '198800445'
'103': '198800619'
'104': '198800622'
'105': '198800788'
'106': '198800791'
'107': '198800901'
'108': '198800902'
'109': '198800911'
'110': '198801052'
'111': '198801525'
'112': '198801531'
'113': '198801937'
'114': '198802273'
'115': '198802349'
'116': '198802355'
'117': '198900123'
'118': '198900125'
'119': '198900129'
'120': '198900223'
'121': '198900263'
'122': '198900630'
'123': '198900711'
'124': '198900817'
'125': '198900881'
'126': '198900993'
'127': '198901021'
'128': '198901206'
'129': '198901207'
'130': '198902026'
'131': '198902101'
'132': '198902108'
'133': '198902158'
'134': '198902844'
'135': '199000074'
'136': '199000165'
'137': '199000352'
'138': '199000530'
'139': '199000568'
'140': '199000820'
'141': '199000983'
'142': '199001080'
'143': '199001166'
'144': '199001735'
'145': '199001917'
'146': '199001919'
'147': '199001973'
'148': '199002022'
'149': '199002349'
'150': '199100118'
'151': '199100476'
'152': '199100626'
'153': '199100628'
'154': '199100632'
'155': '199100636'
'156': '199100838'
'157': '199100923'
'158': '199101034'
'159': '199101166'
'160': '199101229'
'161': '199101291'
'162': '199101298'
'163': '199101995'
'164': '199102137'
'165': '199102138'
'166': '199102384'
'167': '199102409'
'168': '199102447'
'169': '199102449'
'170': '199102476'
'171': '199102956'
'172': '199103287'
'173': '199200232'
'174': '199200236'
'175': '199200261'
'176': '199200476'
'177': '199200490'
'178': '199200629'
'179': '199200633'
'180': '199200653'
'181': '199200792'
'182': '199200869'
'183': '199200870'
'184': '199200876'
'185': '199201002'
'186': '199201006'
'187': '199201045'
'188': '199201046'
'189': '199201154'
'190': '199201155'
'191': '199201214'
'192': '199202074'
'193': '199202334'
'194': '199202495'
'195': '199202562'
'196': '199202565'
'197': '199202572'
'198': '199202944'
'199': '199202946'
'200': '199203272'
'201': '199203279'
'202': '199203487'
'203': '199203520'
'204': '199300175'
'205': '199300208'
'206': '199300225'
'207': '199300319'
'208': '199300325'
'209': '199300516'
'210': '199300692'
'211': '199300758'
'212': '199300815'
'213': '199300927'
'214': '199300928'
'215': '199300997'
'216': '199301111'
'217': '199301180'
'218': '199301193'
'219': '199301673'
'220': '199302061'
'221': '199302106'
'222': '199302107'
'223': '199302108'
'224': '199302175'
'225': '199302198'
'226': '199302663'
'227': '199302666'
'228': '199302671'
'229': '199400095'
'230': '199400114'
'231': '199400191'
'232': '199400362'
'233': '199400392'
'234': '199400488'
'235': '199400494'
'236': '199400513'
'237': '199400514'
'238': '199400540'
'239': '199400625'
'240': '199400630'
'241': '199400635'
'242': '199400682'
'243': '199400695'
'244': '199400696'
'245': '199400704'
'246': '199400710'
'247': '199400823'
'248': '199400868'
'249': '199400905'
'250': '199400946'
'251': '199400948'
'252': '199401165'
'253': '199401504'
'254': '199401625'
'255': '199401734'
'256': '199401754'
'257': '199401768'
'258': '199401770'
'259': '199401998'
'260': '199500043'
'261': '199500183'
'262': '199500439'
'263': '199500466'
'264': '199500470'
'265': '199500549'
'266': '199500630'
'267': '199500842'
'268': '199500845'
'269': '199500850'
'270': '199500959'
'271': '199501023'
'272': '199501072'
'273': '199501073'
'274': '199501234'
'275': '199501564'
'276': '199501728'
'277': '199501734'
'278': '199501735'
'279': '199501736'
'280': '199501738'
'281': '199501743'
'282': '199501748'
'283': '199501749'
'284': '199501905'
'285': '199501906'
'286': '199502137'
'287': '199502153'
'288': '199502155'
'289': '199502167'
'290': '199502192'
'291': '199502215'
'292': '199502217'
'293': '199502223'
'294': '199502224'
'295': '199502225'
'296': '199502428'
'297': '199502575'
'298': '199502582'
'299': '199502585'
'300': '199504100'
'301': '199504101'
'302': '199504272'
'303': '199504273'
'304': '199504276'
'305': '199504332'
'306': '199504367'
'307': '199600151'
'308': '199600158'
'309': '199600246'
'310': '199600267'
'311': '199600367'
'312': '199600597'
'313': '199600616'
'314': '199600625'
'315': '199600977'
'316': '199601062'
'317': '199601139'
'318': '199601167'
'319': '199601484'
'320': '199601520'
'321': '199601533'
'322': '199602013'
'323': '199602015'
'324': '199602019'
'325': '199602021'
'326': '199602022'
'327': '199602023'
'328': '199602029'
'329': '199602031'
'330': '199602290'
'331': '199602404'
'332': '199602533'
'333': '199602566'
'334': '199602745'
'335': '199603160'
'336': '199603453'
'337': '199603455'
'338': '199603458'
'339': '199604928'
'340': '199604979'
'341': '199605127'
'342': '199700160'
'343': '199700176'
'344': '199700389'
'345': '199700584'
'346': '199700734'
'347': '199700738'
'348': '199700745'
'349': '199700798'
'350': '199700808'
'351': '199700809'
'352': '199700833'
'353': '199700840'
'354': '199700883'
'355': '199700887'
'356': '199700909'
'357': '199700916'
'358': '199700918'
'359': '199700919'
'360': '199700978'
'361': '199701009'
'362': '199701010'
'363': '199701011'
'364': '199701012'
'365': '199701033'
'366': '199701054'
'367': '199701063'
'368': '199701064'
'369': '199701065'
'370': '199701131'
'371': '199701225'
'372': '199701226'
'373': '199701227'
'374': '199701265'
'375': '199701271'
'376': '199701295'
'377': '199701385'
'378': '199701392'
'379': '199701393'
'380': '199701398'
'381': '199701401'
'382': '199701408'
'383': '199701414'
'384': '199701416'
'385': '199701432'
'386': '199701437'
'387': '199701525'
'388': '199701668'
'389': '199701845'
'390': '199702105'
'391': '199702106'
'392': '199702117'
'393': '199702118'
'394': '199702126'
'395': '199702163'
'396': '199702164'
'397': '199702165'
'398': '199702177'
'399': '199702351'
'400': '199702352'
'401': '199702380'
'402': '199702412'
'403': '199702566'
'404': '199702588'
'405': '199702829'
'406': '199702839'
'407': '199702846'
'408': '199702918'
'409': '199702953'
'410': '199703049'
'411': '199703126'
'412': '199703129'
'413': '199703432'
'414': '199703566'
'415': '199703570'
'416': '199704687'
'417': '199800085'
'418': '199800086'
'419': '199800142'
'420': '199800471'
'421': '199800503'
'422': '199800519'
'423': '199800751'
'424': '199800785'
'425': '199800903'
'426': '199800976'
'427': '199800979'
'428': '199800990'
'429': '199801011'
'430': '199801012'
'431': '199801099'
'432': '199801158'
'433': '199801165'
'434': '199801260'
'435': '199801263'
'436': '199801266'
'437': '199801267'
'438': '199801290'
'439': '199801292'
'440': '199801298'
'441': '199801524'
'442': '199801625'
'443': '199801634'
'444': '199801640'
'445': '199801649'
'446': '199802092'
'447': '199802093'
'448': '199802138'
'449': '199802175'
'450': '199802288'
'451': '199802289'
'452': '199802611'
'453': '199802620'
'454': '199802675'
'455': '199802769'
'456': '199803014'
'457': '199803062'
'458': '199803425'
'459': '199803684'
'460': '199803957'
'461': '199803958'
'462': '199806739'
'463': '199806820'
'464': '199806914'
'465': '199806978'
'466': '199900114'
'467': '199900120'
'468': '199900163'
'469': '199900203'
'470': '199900204'
'471': '199900351'
'472': '199900554'
'473': '199900672'
'474': '199900832'
'475': '199900834'
'476': '199900915'
'477': '199901076'
'478': '199901116'
'479': '199901211'
'480': '199901215'
'481': '199901218'
'482': '199901248'
'483': '199901258'
'484': '199901558'
'485': '199901692'
'486': '199901718'
'487': '199901759'
'488': '199902270'
'489': '199902295'
'490': '199902440'
'491': '199902448'
'492': '199902449'
'493': '199902459'
'494': '199902661'
'495': '199903179'
'496': '199903219'
'497': '199903222'
'498': '199903365'
'499': '199903836'
'500': '199903837'
'501': '199906840'
'502': '199907419'
'503': '200000209'
'504': '200000210'
'505': '200000220'
'506': '200000223'
'507': '200000224'
'508': '200000417'
'509': '200000559'
'510': '200000569'
'511': '200000601'
'512': '200000740'
'513': '200000919'
'514': '200000930'
'515': '200000932'
'516': '200000933'
'517': '200000934'
'518': '200000935'
'519': '200001172'
'520': '200001184'
'521': '200001186'
'522': '200001190'
'523': '200001275'
'524': '200001430'
'525': '200001578'
'526': '200001604'
'527': '200001858'
'528': '200002121'
'529': '200002422'
'530': '200002738'
'531': '200002760'
'532': '200002776'
'533': '200002787'
'534': '200002882'
'535': '200002887'
'536': '200002949'
'537': '200003080'
'538': '200003085'
'539': '200003086'
'540': '200003109'
'541': '200003111'
'542': '200003403'
'543': '200003442'
'544': '200003541'
'545': '200003560'
'546': '200003792'
'547': '200003994'
'548': '200003999'
'549': '200004038'
'550': '200004243'
'551': '200004640'
'552': '200004643'
'553': '200005087'
'554': '200005100'
'555': '200008568'
'556': '200008571'
'557': '200008952'
'558': '200009059'
'559': '200009061'
'560': '200009062'
'561': '200009063'
'562': '200009333'
'563': '200009496'
'564': '200009524'
'565': '200009634'
'566': '200009708'
'567': '200100102'
'568': '200100264'
'569': '200100383'
'570': '200100424'
'571': '200100429'
'572': '200100602'
'573': '200100614'
'574': '200100671'
'575': '200100725'
'576': '200100827'
'577': '200100858'
'578': '200100960'
'579': '200100997'
'580': '200101016'
'581': '200101017'
'582': '200101018'
'583': '200101022'
'584': '200101103'
'585': '200101123'
'586': '200101150'
'587': '200101162'
'588': '200101179'
'589': '200101249'
'590': '200101393'
'591': '200101407'
'592': '200101428'
'593': '200101451'
'594': '200101472'
'595': '200101633'
'596': '200101635'
'597': '200102266'
'598': '200102615'
'599': '200102622'
'600': '200102623'
'601': '200102644'
'602': '200102802'
'603': '200102845'
'604': '200102846'
'605': '200102887'
'606': '200102909'
'607': '200102920'
'608': '200102931'
'609': '200102998'
'610': '200103187'
'611': '200103882'
'612': '200103884'
'613': '200108384'
'614': '200108800'
'615': '200108817'
'616': '200108822'
'617': '200109139'
'618': '200109233'
'619': '200109769'
'620': '200110053'
'621': '200110059'
'622': '200200143'
'623': '200200152'
'624': '200200158'
'625': '200200169'
'626': '200200173'
'627': '200200201'
'628': '200200234'
'629': '200200252'
'630': '200200375'
'631': '200200377'
'632': '200200486'
'633': '200200502'
'634': '200200611'
'635': '200200663'
'636': '200200715'
'637': '200200719'
'638': '200200720'
'639': '200200723'
'640': '200200852'
'641': '200200939'
'642': '200201028'
'643': '200201029'
'644': '200201084'
'645': '200201087'
'646': '200201201'
'647': '200201329'
'648': '200201335'
'649': '200201339'
'650': '200201406'
'651': '200201524'
'652': '200201895'
'653': '200201906'
'654': '200201919'
'655': '200201926'
'656': '200201927'
'657': '200202092'
'658': '200202093'
'659': '200202362'
'660': '200202373'
'661': '200202476'
'662': '200202573'
'663': '200202617'
'664': '200202753'
'665': '200202755'
'666': '200202781'
'667': '200202782'
'668': '200202982'
'669': '200202994'
'670': '200203031'
'671': '200203496'
'672': '200203727'
'673': '200204308'
'674': '200204322'
'675': '200204348'
'676': '200204369'
'677': '200209794'
'678': '200210513'
'679': '200300089'
'680': '200300092'
'681': '200300095'
'682': '200300100'
'683': '200300102'
'684': '200300103'
'685': '200300104'
'686': '200300144'
'687': '200300154'
'688': '200300155'
'689': '200300192'
'690': '200300198'
'691': '200300260'
'692': '200300269'
'693': '200300283'
'694': '200300284'
'695': '200300286'
'696': '200300312'
'697': '200300313'
'698': '200300406'
'699': '200300408'
'700': '200300409'
'701': '200300423'
'702': '200300424'
'703': '200300431'
'704': '200300434'
'705': '200300458'
'706': '200300459'
'707': '200300461'
'708': '200300462'
'709': '200300463'
'710': '200300470'
'711': '200300520'
'712': '200300521'
'713': '200300523'
'714': '200300598'
'715': '200300603'
'716': '200300626'
'717': '200300638'
'718': '200300642'
'719': '200300645'
'720': '200300657'
'721': '200300660'
'722': '200300661'
'723': '200300694'
'724': '200300708'
'725': '200300784'
'726': '200300792'
'727': '200300796'
'728': '200300800'
'729': '200300801'
'730': '200300802'
'731': '200300804'
'732': '200300805'
'733': '200300806'
'734': '200300840'
'735': '200300843'
'736': '200300854'
'737': '200300855'
'738': '200300864'
'739': '200300878'
'740': '200300905'
'741': '200300913'
'742': '200300916'
'743': '200300917'
'744': '200300918'
'745': '200300984'
'746': '200300985'
'747': '200300987'
'748': '200301018'
'749': '200301019'
'750': '200301021'
'751': '200301031'
'752': '200301032'
'753': '200301033'
'754': '200301128'
'755': '200301131'
'756': '200301172'
'757': '200301181'
'758': '200301252'
'759': '200301323'
'760': '200301345'
'761': '200301348'
'762': '200301486'
'763': '200301507'
'764': '200301609'
'765': '200301610'
'766': '200301612'
'767': '200301618'
'768': '200301620'
'769': '200301624'
'770': '200301625'
'771': '200301667'
'772': '200301683'
'773': '200301739'
'774': '200301759'
'775': '200301795'
'776': '200301831'
'777': '200301937'
'778': '200301953'
'779': '200301956'
'780': '200301990'
'781': '200301991'
'782': '200302009'
'783': '200302015'
'784': '200302039'
'785': '200302049'
'786': '200302050'
'787': '200302051'
'788': '200302052'
'789': '200302053'
'790': '200302054'
'791': '200302073'
'792': '200302121'
'793': '200302122'
'794': '200302141'
'795': '200302187'
'796': '200302205'
'797': '200302214'
'798': '200302215'
'799': '200302254'
'800': '200302452'
'801': '200302456'
'802': '200302457'
'803': '200302476'
'804': '200302482'
'805': '200302484'
'806': '200302485'
'807': '200302486'
'808': '200302623'
'809': '200302627'
'810': '200302642'
'811': '200303280'
'812': '200303281'
'813': '200303328'
'814': '200307647'
'815': '200307781'
'816': '200307798'
'817': '200307799'
'818': '200307847'
'819': '200307848'
'820': '200307866'
'821': '200307867'
'822': '200308353'
'823': '200308354'
'824': '200308550'
'825': '200308552'
'826': '200308578'
'827': '200308603'
'828': '200308647'
'829': '200308774'
'830': '200308825'
'831': '200308841'
'832': '200308855'
'833': '200308867'
'834': '200400062'
'835': '200400100'
'836': '200400102'
'837': '200400104'
'838': '200400105'
'839': '200400111'
'840': '200400112'
'841': '200400113'
'842': '200400114'
'843': '200400118'
'844': '200400120'
'845': '200400121'
'846': '200400122'
'847': '200400123'
'848': '200400165'
'849': '200400166'
'850': '200400173'
'851': '200400175'
'852': '200400177'
'853': '200400179'
'854': '200400215'
'855': '200400323'
'856': '200400325'
'857': '200400387'
'858': '200400388'
'859': '200400392'
'860': '200400395'
'861': '200400397'
'862': '200400463'
'863': '200400465'
'864': '200400478'
'865': '200400482'
'866': '200400514'
'867': '200400519'
'868': '200400520'
'869': '200400528'
'870': '200400530'
'871': '200400538'
'872': '200400550'
'873': '200400561'
'874': '200400565'
'875': '200400570'
'876': '200400580'
'877': '200400581'
'878': '200400587'
'879': '200400644'
'880': '200400685'
'881': '200400686'
'882': '200400687'
'883': '200400691'
'884': '200400695'
'885': '200400700'
'886': '200400718'
'887': '200400724'
'888': '200400727'
'889': '200400728'
'890': '200400729'
'891': '200400798'
'892': '200400799'
'893': '200400801'
'894': '200400802'
'895': '200400803'
'896': '200400804'
'897': '200400813'
'898': '200400816'
'899': '200400817'
'900': '200400825'
'901': '200400828'
'902': '200400830'
'903': '200400841'
'904': '200400872'
'905': '200400894'
'906': '200400904'
'907': '200400910'
'908': '200400922'
'909': '200400930'
'910': '200400993'
'911': '200401005'
'912': '200401009'
'913': '200401062'
'914': '200401108'
'915': '200401172'
'916': '200401202'
'917': '200401210'
'918': '200401211'
'919': '200401219'
'920': '200401221'
'921': '200401222'
'922': '200401235'
'923': '200401242'
'924': '200401244'
'925': '200401247'
'926': '200401254'
'927': '200401323'
'928': '200401324'
'929': '200401325'
'930': '200401326'
'931': '200401327'
'932': '200401328'
'933': '200401329'
'934': '200401330'
'935': '200401332'
'936': '200401353'
'937': '200401359'
'938': '200401367'
'939': '200401369'
'940': '200401370'
'941': '200401371'
'942': '200401373'
'943': '200401383'
'944': '200401385'
'945': '200401397'
'946': '200401399'
'947': '200401401'
'948': '200401405'
'949': '200401407'
'950': '200401409'
'951': '200401411'
'952': '200401432'
'953': '200401547'
'954': '200401548'
'955': '200401578'
'956': '200401582'
'957': '200401586'
'958': '200401589'
'959': '200401651'
'960': '200401652'
'961': '200401666'
'962': '200401683'
'963': '200401684'
'964': '200401686'
'965': '200401687'
'966': '200401693'
'967': '200401718'
'968': '200401747'
'969': '200401750'
'970': '200401751'
'971': '200401755'
'972': '200401756'
'973': '200401762'
'974': '200401769'
'975': '200401848'
'976': '200401849'
'977': '200401850'
'978': '200401852'
'979': '200401857'
'980': '200401862'
'981': '200401874'
'982': '200401875'
'983': '200401876'
'984': '200401878'
'985': '200401881'
'986': '200401882'
'987': '200402344'
'988': '200402368'
'989': '200402374'
'990': '200402391'
'991': '200402393'
'992': '200402396'
'993': '200402399'
'994': '200402407'
'995': '200402408'
'996': '200402420'
'997': '200402435'
'998': '200402439'
'999': '200402455'
'1000': '200402458'
'1001': '200402464'
'1002': '200402465'
'1003': '200402466'
'1004': '200402468'
'1005': '200402485'
'1006': '200402487'
'1007': '200402490'
'1008': '200402508'
'1009': '200402510'
'1010': '200402522'
'1011': '200402557'
'1012': '200402604'
'1013': '200402617'
'1014': '200402623'
'1015': '200402674'
'1016': '200402677'
'1017': '200402681'
'1018': '200402690'
'1019': '200402816'
'1020': '200402819'
'1021': '200402823'
'1022': '200402824'
'1023': '200402825'
'1024': '200402830'
'1025': '200402834'
'1026': '200402835'
'1027': '200402839'
'1028': '200402840'
'1029': '200402841'
'1030': '200402842'
'1031': '200402845'
'1032': '200402847'
'1033': '200402848'
'1034': '200402852'
'1035': '200402853'
'1036': '200402879'
'1037': '200402902'
'1038': '200402903'
'1039': '200402912'
'1040': '200402913'
'1041': '200402915'
'1042': '200402927'
'1043': '200402928'
'1044': '200402929'
'1045': '200402931'
'1046': '200402939'
'1047': '200402943'
'1048': '200402944'
'1049': '200402945'
'1050': '200402946'
'1051': '200402947'
'1052': '200402949'
'1053': '200402952'
'1054': '200402968'
'1055': '200402973'
'1056': '200402979'
'1057': '200402981'
'1058': '200402982'
'1059': '200402983'
'1060': '200402992'
'1061': '200403040'
'1062': '200403041'
'1063': '200403042'
'1064': '200403049'
'1065': '200403052'
'1066': '200403053'
'1067': '200403054'
'1068': '200403056'
'1069': '200403060'
'1070': '200403062'
'1071': '200403075'
'1072': '200403081'
'1073': '200403082'
'1074': '200403084'
'1075': '200403085'
'1076': '200403086'
'1077': '200403089'
'1078': '200403090'
'1079': '200403104'
'1080': '200403111'
'1081': '200403121'
'1082': '200403125'
'1083': '200403126'
'1084': '200403137'
'1085': '200403138'
'1086': '200403160'
'1087': '200403167'
'1088': '200403183'
'1089': '200403184'
'1090': '200403185'
'1091': '200403186'
'1092': '200403226'
'1093': '200403289'
'1094': '200403290'
'1095': '200403291'
'1096': '200403292'
'1097': '200403296'
'1098': '200403304'
'1099': '200403305'
'1100': '200403309'
'1101': '200403312'
'1102': '200403317'
'1103': '200403319'
'1104': '200403321'
'1105': '200403323'
'1106': '200403325'
'1107': '200403327'
'1108': '200403329'
'1109': '200403333'
'1110': '200403335'
'1111': '200403336'
'1112': '200403405'
'1113': '200403411'
'1114': '200403413'
'1115': '200403414'
'1116': '200403415'
'1117': '200403419'
'1118': '200403425'
'1119': '200403426'
'1120': '200403427'
'1121': '200403433'
'1122': '200403434'
'1123': '200403448'
'1124': '200403459'
'1125': '200403460'
'1126': '200403461'
'1127': '200403462'
'1128': '200403463'
'1129': '200403464'
'1130': '200403465'
'1131': '200403783'
'1132': '200403794'
'1133': '200403814'
'1134': '200403816'
'1135': '200403819'
'1136': '200403822'
'1137': '200403982'
'1138': '200404000'
'1139': '200404007'
'1140': '200404009'
'1141': '200404029'
'1142': '200404034'
'1143': '200404038'
'1144': '200404039'
'1145': '200404040'
'1146': '200404042'
'1147': '200404043'
'1148': '200404044'
'1149': '200404046'
'1150': '200404048'
'1151': '200404050'
'1152': '200404051'
'1153': '200404055'
'1154': '200404059'
'1155': '200404060'
'1156': '200404168'
'1157': '200404421'
'1158': '200404437'
'1159': '200404438'
'1160': '200404644'
'1161': '200404646'
'1162': '200404647'
'1163': '200404653'
'1164': '200404654'
'1165': '200404656'
'1166': '200404657'
'1167': '200404669'
'1168': '200404670'
'1169': '200404671'
'1170': '200404677'
'1171': '200404678'
'1172': '200404684'
'1173': '200404690'
'1174': '200404692'
'1175': '200404693'
'1176': '200404695'
'1177': '200404698'
'1178': '200404699'
'1179': '200404700'
'1180': '200404701'
'1181': '200404702'
'1182': '200404707'
'1183': '200404709'
'1184': '200404711'
'1185': '200404712'
'1186': '200404719'
'1187': '200404721'
'1188': '200404722'
'1189': '200404724'
'1190': '200404726'
'1191': '200404740'
'1192': '200404741'
'1193': '200404747'
'1194': '200404754'
'1195': '200404762'
'1196': '200404763'
'1197': '200404771'
'1198': '200404772'
'1199': '200404779'
'1200': '200409929'
'1201': '200409930'
'1202': '200409932'
'1203': '200409933'
'1204': '200410082'
'1205': '200410083'
'1206': '200410085'
'1207': '200410086'
'1208': '200410087'
'1209': '200410088'
'1210': '200410089'
'1211': '200410090'
'1212': '200410337'
'1213': '200410865'
'1214': '200410892'
'1215': '200410894'
'1216': '200410896'
'1217': '200410897'
'1218': '200410901'
'1219': '200410902'
'1220': '200410905'
'1221': '200410909'
'1222': '200410941'
'1223': '200410942'
'1224': '200410943'
'1225': '200410951'
'1226': '200410977'
'1227': '200410999'
'1228': '200411002'
'1229': '200411033'
'1230': '200411042'
'1231': '200411043'
'1232': '200411062'
'1233': '200411064'
'1234': '200411065'
'1235': '200411066'
'1236': '200411067'
'1237': '200411068'
'1238': '200411087'
'1239': '200411095'
'1240': '200411207'
'1241': '200411208'
'1242': '200500111'
'1243': '200500125'
'1244': '200500128'
'1245': '200500130'
'1246': '200500132'
'1247': '200500133'
'1248': '200500134'
'1249': '200500135'
'1250': '200500140'
'1251': '200500144'
'1252': '200500145'
'1253': '200500146'
'1254': '200500150'
'1255': '200500161'
'1256': '200500162'
'1257': '200500246'
'1258': '200500248'
'1259': '200500249'
'1260': '200500251'
'1261': '200500252'
'1262': '200500253'
'1263': '200500254'
'1264': '200500255'
'1265': '200500257'
'1266': '200500258'
'1267': '200500287'
'1268': '200500288'
'1269': '200500302'
'1270': '200500424'
'1271': '200500426'
'1272': '200500430'
'1273': '200500474'
'1274': '200500476'
'1275': '200500482'
'1276': '200500483'
'1277': '200500545'
'1278': '200500552'
'1279': '200500553'
'1280': '200500554'
'1281': '200500581'
'1282': '200500583'
'1283': '200500585'
'1284': '200500589'
'1285': '200500592'
'1286': '200500605'
'1287': '200500622'
'1288': '200500649'
'1289': '200500686'
'1290': '200500694'
'1291': '200500695'
'1292': '200500699'
'1293': '200500704'
'1294': '200500797'
'1295': '200500800'
'1296': '200500805'
'1297': '200500806'
'1298': '200500809'
'1299': '200500815'
'1300': '200500835'
'1301': '200500853'
'1302': '200500854'
'1303': '200500864'
'1304': '200500870'
'1305': '200500871'
'1306': '200500882'
'1307': '200500892'
'1308': '200500961'
'1309': '200500962'
'1310': '200500963'
'1311': '200500965'
'1312': '200500969'
'1313': '200501012'
'1314': '200501107'
'1315': '200501119'
'1316': '200501120'
'1317': '200501141'
'1318': '200501145'
'1319': '200501155'
'1320': '200501163'
'1321': '200501237'
'1322': '200501239'
'1323': '200501241'
'1324': '200501243'
'1325': '200501265'
'1326': '200501267'
'1327': '200501277'
'1328': '200501288'
'1329': '200501291'
'1330': '200501293'
'1331': '200501301'
'1332': '200501307'
'1333': '200501384'
'1334': '200501427'
'1335': '200501429'
'1336': '200501458'
'1337': '200501472'
'1338': '200501511'
'1339': '200501522'
'1340': '200501593'
'1341': '200501601'
'1342': '200501605'
'1343': '200501618'
'1344': '200501667'
'1345': '200501979'
'1346': '200501997'
'1347': '200501999'
'1348': '200502010'
'1349': '200502012'
'1350': '200502014'
'1351': '200502017'
'1352': '200502029'
'1353': '200502030'
'1354': '200502035'
'1355': '200502039'
'1356': '200502050'
'1357': '200502052'
'1358': '200502054'
'1359': '200502086'
'1360': '200502087'
'1361': '200502089'
'1362': '200502090'
'1363': '200502105'
'1364': '200502107'
'1365': '200502109'
'1366': '200502175'
'1367': '200502192'
'1368': '200502218'
'1369': '200502219'
'1370': '200502306'
'1371': '200502416'
'1372': '200502470'
'1373': '200502472'
'1374': '200502473'
'1375': '200502474'
'1376': '200502490'
'1377': '200502557'
'1378': '200502588'
'1379': '200502598'
'1380': '200502600'
'1381': '200502601'
'1382': '200502633'
'1383': '200502711'
'1384': '200502717'
'1385': '200502742'
'1386': '200502744'
'1387': '200502757'
'1388': '200502762'
'1389': '200502770'
'1390': '200502808'
'1391': '200502809'
'1392': '200502810'
'1393': '200502834'
'1394': '200502838'
'1395': '200502901'
'1396': '200502953'
'1397': '200503063'
'1398': '200503065'
'1399': '200503066'
'1400': '200503075'
'1401': '200503128'
'1402': '200503200'
'1403': '200503234'
'1404': '200503238'
'1405': '200503246'
'1406': '200503619'
'1407': '200503637'
'1408': '200503642'
'1409': '200503652'
'1410': '200503653'
'1411': '200503664'
'1412': '200503775'
'1413': '200503783'
'1414': '200503786'
'1415': '200503787'
'1416': '200503788'
'1417': '200503798'
'1418': '200503799'
'1419': '200503803'
'1420': '200503806'
'1421': '200503807'
'1422': '200505665'
'1423': '200511055'
'1424': '200511056'
'1425': '200511057'
'1426': '200511058'
'1427': '200511059'
'1428': '200511060'
'1429': '200511064'
'1430': '200511065'
'1431': '200511066'
'1432': '200511067'
'1433': '200511068'
'1434': '200511106'
'1435': '200511107'
'1436': '200511108'
'1437': '200511262'
'1438': '200511263'
'1439': '200511264'
'1440': '200511266'
'1441': '200511904'
'1442': '200512186'
'1443': '200512201'
'1444': '200512238'
'1445': '200512240'
'1446': '200512244'
'1447': '200600026'
'1448': '200600027'
'1449': '200600028'
'1450': '200600036'
'1451': '200600049'
'1452': '200600054'
'1453': '200600081'
'1454': '200600114'
'1455': '200600116'
'1456': '200600127'
'1457': '200600128'
'1458': '200600134'
'1459': '200600138'
'1460': '200600156'
'1461': '200600159'
'1462': '200600160'
'1463': '200600178'
'1464': '200600181'
'1465': '200600221'
'1466': '200600222'
'1467': '200600225'
'1468': '200600248'
'1469': '200600249'
'1470': '200600267'
'1471': '200600287'
'1472': '200600325'
'1473': '200600352'
'1474': '200600367'
'1475': '200600456'
'1476': '200600460'
'1477': '200600522'
'1478': '200600527'
'1479': '200600543'
'1480': '200600562'
'1481': '200600665'
'1482': '200600674'
'1483': '200600688'
'1484': '200600696'
'1485': '200600697'
'1486': '200600698'
'1487': '200600811'
'1488': '200600818'
'1489': '200600819'
'1490': '200600821'
'1491': '200603815'
'1492': '200603871'
'1493': '200603872'
'1494': '200604151'
'1495': '200604193'
'1496': '200604212'
'1497': '200604227'
'1498': '200604239'
'1499': '200604243'
'1500': '200604258'
'1501': '200604293'
'1502': '200604308'
'1503': '200604312'
'1504': '200604332'
'1505': '200604334'
'1506': '200604358'
'1507': '200604408'
'1508': '200605253'
'1509': '200605344'
'1510': '200605364'
'1511': '200605381'
'1512': '200605398'
'1513': '200605403'
'1514': '200605410'
'1515': '200605427'
'1516': '200605439'
'1517': '200605445'
'1518': '200605447'
'1519': '200605448'
'1520': '200605450'
'1521': '200605468'
'1522': '200605481'
'1523': '200605499'
'1524': '200605537'
'1525': '200606180'
'1526': '200606181'
'1527': '200606182'
'1528': '200606183'
'1529': '200606294'
'1530': '200606297'
'1531': '200606303'
'1532': '200606308'
'1533': '200606318'
'1534': '200606326'
'1535': '200606351'
'1536': '200606537'
'1537': '200606825'
'1538': '200606830'
'1539': '200606835'
'1540': '200606965'
'1541': '200606986'
'1542': '200607004'
'1543': '200607445'
'1544': '200607486'
'1545': '200607677'
'1546': '200607709'
'1547': '200607743'
'1548': '200607747'
'1549': '200607750'
'1550': '200607776'
'1551': '200607777'
'1552': '200607778'
'1553': '200607784'
'1554': '200607785'
'1555': '200607847'
'1556': '200607874'
'1557': '200607876'
'1558': '200607890'
'1559': '200607907'
'1560': '200607908'
'1561': '200608189'
'1562': '200608192'
'1563': '200608219'
'1564': '200608232'
'1565': '200608240'
'1566': '200608275'
'1567': '200608278'
'1568': '200608280'
'1569': '200608291'
'1570': '200608295'
'1571': '200608296'
'1572': '200610660'
'1573': '200610661'
'1574': '200610785'
'1575': '200610879'
'1576': '200610960'
'1577': '200610978'
'1578': '200610980'
'1579': '200610986'
'1580': '200611006'
'1581': '200611007'
'1582': '200611010'
'1583': '200611019'
'1584': '200611031'
'1585': '200611039'
'1586': '200611041'
'1587': '200611042'
'1588': '200611053'
'1589': '200611114'
'1590': '200611303'
'1591': '200611304'
'1592': '200611332'
'1593': '200611338'
'1594': '200611385'
'1595': '200611392'
'1596': '200612115'
'1597': '200612573'
'1598': '200612728'
'1599': '200612743'
'1600': '200700055'
'1601': '200700234'
'1602': '200700435'
'1603': '200700437'
'1604': '200700455'
'1605': '200700540'
'1606': '200700562'
'1607': '200700790'
'1608': '200700814'
'1609': '200700835'
'1610': '200700836'
'1611': '200700844'
'1612': '200700876'
'1613': '200700890'
'1614': '200701020'
'1615': '200701024'
'1616': '200701042'
'1617': '200701201'
'1618': '200701231'
'1619': '200701233'
'1620': '200701235'
'1621': '200701236'
'1622': '200701246'
'1623': '200701251'
'1624': '200701335'
'1625': '200701403'
'1626': '200701405'
'1627': '200701406'
'1628': '200701412'
'1629': '200701642'
'1630': '200701809'
'1631': '200701876'
'1632': '200701917'
'1633': '200701932'
'1634': '200701934'
'1635': '200702089'
'1636': '200702091'
'1637': '200702101'
'1638': '200702108'
'1639': '200702135'
'1640': '200702153'
'1641': '200702164'
'1642': '200702166'
'1643': '200702172'
'1644': '200702173'
'1645': '200702191'
'1646': '200702195'
'1647': '200702200'
'1648': '200702250'
'1649': '200702255'
'1650': '200702260'
'1651': '200702309'
'1652': '200702313'
'1653': '200702358'
'1654': '200702389'
'1655': '200702420'
'1656': '200702508'
'1657': '200702521'
'1658': '200702532'
'1659': '200702536'
'1660': '200702538'
'1661': '200702549'
'1662': '200702572'
'1663': '200702577'
'1664': '200702584'
'1665': '200702691'
'1666': '200702709'
'1667': '200702739'
'1668': '200702749'
'1669': '200702963'
'1670': '200702966'
'1671': '200702967'
'1672': '200702969'
'1673': '200702970'
'1674': '200703052'
'1675': '200703075'
'1676': '200703120'
'1677': '200703121'
'1678': '200703129'
'1679': '200703132'
'1680': '200703142'
'1681': '200703160'
'1682': '200703161'
'1683': '200703163'
'1684': '200703169'
'1685': '200703171'
'1686': '200703172'
'1687': '200703173'
'1688': '200703185'
'1689': '200703201'
'1690': '200703229'
'1691': '200703233'
'1692': '200703236'
'1693': '200703261'
'1694': '200703278'
'1695': '200703394'
'1696': '200703429'
'1697': '200703452'
'1698': '200703458'
'1699': '200703459'
'1700': '200703462'
'1701': '200703463'
'1702': '200703464'
'1703': '200703663'
'1704': '200703675'
'1705': '200703700'
'1706': '200703701'
'1707': '200703715'
'1708': '200703733'
'1709': '200703737'
'1710': '200703797'
'1711': '200703801'
'1712': '200703835'
'1713': '200703842'
'1714': '200703852'
'1715': '200703873'
'1716': '200703876'
'1717': '200703877'
'1718': '200703964'
'1719': '200704019'
'1720': '200704285'
'1721': '200704316'
'1722': '200704320'
'1723': '200704322'
'1724': '200704420'
'1725': '200704501'
'1726': '200704534'
'1727': '200704541'
'1728': '200704587'
'1729': '200704612'
'1730': '200704614'
'1731': '200704765'
'1732': '200704813'
'1733': '200704961'
'1734': '200704969'
'1735': '200705001'
'1736': '200705314'
'1737': '200705392'
'1738': '200705402'
'1739': '200705500'
'1740': '200705587'
'1741': '200705588'
'1742': '200705595'
'1743': '200705596'
'1744': '200705817'
'1745': '200706007'
'1746': '200706008'
'1747': '200706029'
'1748': '200706030'
'1749': '200706059'
'1750': '200706185'
'1751': '200706243'
'1752': '200706330'
'1753': '200706331'
'1754': '200706332'
'1755': '200706333'
'1756': '200706341'
'1757': '200706400'
'1758': '200706467'
'1759': '200706537'
'1760': '200706542'
'1761': '200706569'
'1762': '200706597'
'1763': '200706766'
'1764': '200706788'
'1765': '200706796'
'1766': '200706908'
'1767': '200706941'
'1768': '200706984'
'1769': '200707142'
'1770': '200707175'
'1771': '200707251'
'1772': '200707373'
'1773': '200707382'
'1774': '200707383'
'1775': '200707385'
'1776': '200707388'
'1777': '200707390'
'1778': '200707521'
'1779': '200707723'
'1780': '200707792'
'1781': '200707920'
'1782': '200707944'
'1783': '200707947'
'1784': '200707958'
'1785': '200708055'
'1786': '200708144'
'1787': '200708364'
'1788': '200708367'
'1789': '200708368'
'1790': '200708372'
'1791': '200708376'
'1792': '200708427'
'1793': '200708447'
'1794': '200708455'
'1795': '200708456'
'1796': '200708458'
'1797': '200708534'
'1798': '200708585'
'1799': '200708589'
'1800': '200708592'
'1801': '200708595'
'1802': '200708596'
'1803': '200708597'
'1804': '200708599'
'1805': '200708600'
'1806': '200708601'
'1807': '200708754'
'1808': '200708786'
'1809': '200708816'
'1810': '200708819'
'1811': '200708961'
'1812': '200709004'
'1813': '200709010'
'1814': '200709163'
'1815': '200709201'
'1816': '200709202'
'1817': '200709249'
'1818': '200709331'
'1819': '200709389'
'1820': '200709390'
'1821': '200709448'
'1822': '200709527'
'1823': '200709544'
'1824': '200709638'
'1825': '200709640'
'1826': '200709641'
'1827': '200709794'
'1828': '200709818'
'1829': '200709901'
'1830': '200709902'
'1831': '200709913'
'1832': '200709915'
'1833': '200710087'
'1834': '200710374'
'1835': '200710524'
'1836': '200710628'
'1837': '200710758'
'1838': '200710759'
'1839': '200710760'
'1840': '200710808'
'1841': '200710855'
'1842': '200710856'
'1843': '200710957'
'1844': '200710998'
'1845': '200711148'
'1846': '200711216'
'1847': '200711286'
'1848': '200711309'
'1849': '200711422'
'1850': '200711426'
'1851': '200711456'
'1852': '200711527'
'1853': '200711589'
'1854': '200711634'
'1855': '200711885'
'1856': '200711886'
'1857': '200711927'
'1858': '200711934'
'1859': '200711953'
'1860': '200712070'
'1861': '200712241'
'1862': '200712325'
'1863': '200712401'
'1864': '200712587'
'1865': '200712588'
'1866': '200712647'
'1867': '200712648'
'1868': '200712742'
'1869': '200712789'
'1870': '200712868'
'1871': '200712885'
'1872': '200713004'
'1873': '200713008'
'1874': '200713009'
'1875': '200713032'
'1876': '200713079'
'1877': '200713170'
'1878': '200713296'
'1879': '200713517'
'1880': '200713591'
'1881': '200713685'
'1882': '200713688'
'1883': '200713885'
'1884': '200713886'
'1885': '200713906'
'1886': '200713912'
'1887': '200800284'
'1888': '200800291'
'1889': '200800316'
'1890': '200800320'
'1891': '200800330'
'1892': '200800331'
'1893': '200800339'
'1894': '200800347'
'1895': '200800405'
'1896': '200800411'
'1897': '200800569'
'1898': '200800614'
'1899': '200801141'
'1900': '200801284'
'1901': '200801332'
'1902': '200801333'
'1903': '200801350'
'1904': '200801352'
'1905': '200801383'
'1906': '200801460'
'1907': '200801517'
'1908': '200801564'
'1909': '200801686'
'1910': '200801763'
'1911': '200801812'
'1912': '200801814'
'1913': '200801822'
'1914': '200801836'
'1915': '200802022'
'1916': '200802064'
'1917': '200802066'
'1918': '200802182'
'1919': '200802325'
'1920': '200802327'
'1921': '200802392'
'1922': '200802547'
'1923': '200802557'
'1924': '200802558'
'1925': '200802788'
'1926': '200802853'
'1927': '200802965'
'1928': '200803000'
'1929': '200803148'
'1930': '200803200'
'1931': '200803246'
'1932': '200803290'
'1933': '200803292'
'1934': '200803294'
'1935': '200803417'
'1936': '200803428'
'1937': '200803434'
'1938': '200803641'
'1939': '200803693'
'1940': '200803701'
'1941': '200803737'
'1942': '200803803'
'1943': '200803972'
'1944': '200804092'
'1945': '200804225'
'1946': '200804412'
'1947': '200804519'
'1948': '200804556'
'1949': '200804557'
'1950': '200804560'
'1951': '200804563'
'1952': '200804586'
'1953': '200804611'
'1954': '200804633'
'1955': '200804796'
'1956': '200804798'
'1957': '200804802'
'1958': '200804810'
'1959': '200804912'
'1960': '200804947'
'1961': '200804956'
'1962': '200805037'
'1963': '200805064'
'1964': '200805085'
'1965': '200805090'
'1966': '200805113'
'1967': '200805124'
'1968': '200805152'
'1969': '200805214'
'1970': '200805215'
'1971': '200805262'
'1972': '200805275'
'1973': '200805290'
'1974': '200805301'
'1975': '200805341'
'1976': '200805353'
'1977': '200805387'
'1978': '200805408'
'1979': '200805418'
'1980': '200805421'
'1981': '200805455'
'1982': '200805495'
'1983': '200805550'
'1984': '200805595'
'1985': '200805598'
'1986': '200805599'
'1987': '200805632'
'1988': '200805676'
'1989': '200805680'
'1990': '200805734'
'1991': '200805735'
'1992': '200805737'
'1993': '200805739'
'1994': '200805740'
'1995': '200805748'
'1996': '200805749'
'1997': '200805752'
'1998': '200805810'
'1999': '200805812'
'2000': '200805818'
'2001': '200805819'
'2002': '200805821'
'2003': '200805842'
'2004': '200805849'
'2005': '200805895'
'2006': '200805901'
'2007': '200805915'
'2008': '200805918'
'2009': '200805919'
'2010': '200805944'
'2011': '200805986'
'2012': '200805990'
'2013': '200805991'
'2014': '200806022'
'2015': '200806024'
'2016': '200806190'
'2017': '200806191'
'2018': '200806192'
'2019': '200806194'
'2020': '200806299'
'2021': '200806361'
'2022': '200806365'
'2023': '200806368'
'2024': '200806373'
'2025': '200806455'
'2026': '200806634'
'2027': '200806668'
'2028': '200806736'
'2029': '200806769'
'2030': '200806842'
'2031': '200806896'
'2032': '200806977'
'2033': '200807044'
'2034': '200807051'
'2035': '200807066'
'2036': '200807067'
'2037': '200807068'
'2038': '200807203'
'2039': '200807204'
'2040': '200807205'
'2041': '200807208'
'2042': '200807210'
'2043': '200807211'
'2044': '200807223'
'2045': '200807229'
'2046': '200807230'
'2047': '200807253'
'2048': '200807310'
'2049': '200807361'
'2050': '200807363'
'2051': '200807365'
'2052': '200807367'
'2053': '200807374'
'2054': '200807377'
'2055': '200807391'
'2056': '200807418'
'2057': '200807430'
'2058': '200807431'
'2059': '200807469'
'2060': '200807526'
'2061': '200807527'
'2062': '200807528'
'2063': '200807539'
'2064': '200807552'
'2065': '200807582'
'2066': '200807617'
'2067': '200807618'
'2068': '200807633'
'2069': '200807683'
'2070': '200807684'
'2071': '200807685'
'2072': '200807813'
'2073': '200807814'
'2074': '200807945'
'2075': '200807949'
'2076': '200807985'
'2077': '200807986'
'2078': '200807989'
'2079': '200808076'
'2080': '200808078'
'2081': '200808091'
'2082': '200808126'
'2083': '200808127'
'2084': '200808195'
'2085': '200808219'
'2086': '200808244'
'2087': '200808245'
'2088': '200808272'
'2089': '200808273'
'2090': '200808309'
'2091': '200808362'
'2092': '200808377'
'2093': '200808511'
'2094': '200808562'
'2095': '200808715'
'2096': '200808774'
'2097': '200808790'
'2098': '200808828'
'2099': '200808862'
'2100': '200808985'
'2101': '200809045'
'2102': '200809058'
'2103': '200809076'
'2104': '200809079'
'2105': '200809084'
'2106': '200809185'
'2107': '200809207'
'2108': '200809208'
'2109': '200809240'
'2110': '200809258'
'2111': '200809388'
'2112': '200809402'
'2113': '200809403'
'2114': '200809464'
'2115': '200809649'
'2116': '200809791'
'2117': '200809813'
'2118': '200809818'
'2119': '200809819'
'2120': '200809825'
'2121': '200809902'
'2122': '200809909'
'2123': '200809910'
'2124': '200810003'
'2125': '200810004'
'2126': '200810005'
'2127': '200810006'
'2128': '200810031'
'2129': '200810113'
'2130': '200810114'
'2131': '200810125'
'2132': '200810237'
'2133': '200810314'
'2134': '200810320'
'2135': '200810340'
'2136': '200810424'
'2137': '200810425'
'2138': '200810427'
'2139': '200810440'
'2140': '200810513'
'2141': '200810517'
'2142': '200810584'
'2143': '200810700'
'2144': '200810702'
'2145': '200810703'
'2146': '200810727'
'2147': '200810760'
'2148': '200810870'
'2149': '200810940'
'2150': '200810973'
'2151': '200810981'
'2152': '200811250'
'2153': '200811251'
'2154': '200811253'
'2155': '200811262'
'2156': '200811364'
'2157': '200811406'
'2158': '200811564'
'2159': '200811608'
'2160': '200811623'
'2161': '200811629'
'2162': '200811701'
'2163': '200811736'
'2164': '200811814'
'2165': '200811885'
'2166': '200900194'
'2167': '200900196'
'2168': '200900341'
'2169': '200900606'
'2170': '200900668'
'2171': '200900671'
'2172': '200900674'
'2173': '200900767'
'2174': '200900878'
'2175': '200900889'
'2176': '200900892'
'2177': '200900895'
'2178': '200900897'
'2179': '200900899'
'2180': '200900967'
'2181': '200900997'
'2182': '200901007'
'2183': '200901073'
'2184': '200901207'
'2185': '200901232'
'2186': '200901456'
'2187': '200901507'
'2188': '200901536'
'2189': '200901544'
'2190': '200901619'
'2191': '200901817'
'2192': '200902042'
'2193': '200902080'
'2194': '200902207'
'2195': '200902213'
'2196': '200902281'
'2197': '200902301'
'2198': '200902443'
'2199': '200902507'
'2200': '200902537'
'2201': '200903041'
'2202': '200903043'
'2203': '200903115'
'2204': '200903125'
'2205': '200903409'
'2206': '200903782'
'2207': '200903801'
'2208': '200903974'
'2209': '200904162'
'2210': '200904163'
'2211': '200904234'
'2212': '200904308'
'2213': '200904325'
'2214': '200904514'
'2215': '200904538'
'2216': '200904548'
'2217': '200904649'
'2218': '200904678'
'2219': '200904689'
'2220': '200904855'
'2221': '200904880'
'2222': '200904881'
'2223': '200904990'
'2224': '200905004'
'2225': '200905017'
'2226': '200905053'
'2227': '200905055'
'2228': '200905151'
'2229': '200905163'
'2230': '200905196'
'2231': '200905218'
'2232': '200905315'
'2233': '200905403'
'2234': '200905653'
'2235': '200905876'
'2236': '200905938'
'2237': '200905939'
'2238': '200906073'
'2239': '200906330'
'2240': '200906384'
'2241': '200906557'
'2242': '200906562'
'2243': '200906578'
'2244': '200906599'
'2245': '200906602'
'2246': '200906966'
'2247': '200907093'
'2248': '200907385'
'2249': '200907446'
'2250': '200907725'
'2251': '200908116'
'2252': '200908185'
'2253': '200908186'
'2254': '200908333'
'2255': '200908369'
'2256': '200908478'
'2257': '200908479'
'2258': '200908781'
'2259': '200908817'
'2260': '200908819'
'2261': '200908825'
'2262': '200908953'
'2263': '200909377'
'2264': '201000300'
'2265': '201000406'
'2266': '201000637'
'2267': '201000674'
'2268': '201000826'
'2269': '201000869'
'2270': '201000874'
'2271': '201000875'
'2272': '201000877'
'2273': '201000878'
'2274': '201000951'
'2275': '201000963'
'2276': '201000967'
'2277': '201000972'
'2278': '201000974'
'2279': '201000975'
'2280': '201000977'
'2281': '201000980'
'2282': '201000981'
'2283': '201000982'
'2284': '201000985'
'2285': '201000989'
'2286': '201000993'
'2287': '201000994'
'2288': '201000998'
'2289': '201000999'
'2290': '201001001'
'2291': '201001003'
'2292': '201001004'
'2293': '201001005'
'2294': '201001006'
'2295': '201001007'
'2296': '201001008'
'2297': '201001013'
'2298': '201001017'
'2299': '201001018'
'2300': '201001021'
'2301': '201001022'
'2302': '201001023'
'2303': '201001024'
'2304': '201001057'
'2305': '201001424'
'2306': '201001547'
'2307': '201001640'
'2308': '201001656'
'2309': '201001682'
'2310': '201001683'
'2311': '201001685'
'2312': '201001699'
'2313': '201001707'
'2314': '201001723'
'2315': '201001731'
'2316': '201001744'
'2317': '201002151'
'2318': '201002277'
'2319': '201002395'
'2320': '201002396'
'2321': '201002402'
'2322': '201002449'
'2323': '201002450'
'2324': '201002503'
'2325': '201003043'
'2326': '201003049'
'2327': '201003203'
'2328': '201003206'
'2329': '201003207'
'2330': '201003213'
'2331': '201003370'
'2332': '201003397'
'2333': '201003398'
'2334': '201003713'
'2335': '201004170'
'2336': '201004204'
'2337': '201004230'
'2338': '201004257'
'2339': '201004440'
'2340': '201004538'
'2341': '201004680'
'2342': '201004693'
'2343': '201004700'
'2344': '201004997'
'2345': '201005059'
'2346': '201005070'
'2347': '201005121'
'2348': '201005235'
'2349': '201005236'
'2350': '201005685'
'2351': '201005699'
'2352': '201005763'
'2353': '201005770'
'2354': '201005771'
'2355': '201005774'
'2356': '201005775'
'2357': '201005841'
'2358': '201006075'
'2359': '201006077'
'2360': '201006120'
'2361': '201006205'
'2362': '201006206'
'2363': '201006211'
'2364': '201006442'
'2365': '201006443'
'2366': '201006495'
'2367': '201006592'
'2368': '201006784'
'2369': '201006870'
'2370': '201006871'
'2371': '201007233'
'2372': '201007234'
'2373': '201007276'
'2374': '201007277'
'2375': '201007281'
'2376': '201007401'
'2377': '201007460'
'2378': '201007499'
'2379': '201007523'
'2380': '201007529'
'2381': '201007569'
'2382': '201007652'
'2383': '201007672'
'2384': '201007708'
'2385': '201007750'
'2386': '201007774'
'2387': '201007782'
'2388': '201007783'
'2389': '201100026'
'2390': '201100231'
'2391': '201100273'
'2392': '201100299'
'2393': '201100375'
'2394': '201100467'
'2395': '201100491'
'2396': '201100500'
'2397': '201100502'
'2398': '201100572'
'2399': '201100792'
'2400': '201100793'
'2401': '201100794'
'2402': '201100795'
'2403': '201100797'
'2404': '201100798'
'2405': '201100799'
'2406': '201100800'
'2407': '201100801'
'2408': '201100802'
'2409': '201100821'
'2410': '201100826'
'2411': '201100827'
'2412': '201100828'
'2413': '201100866'
'2414': '201100867'
'2415': '201100869'
'2416': '201100888'
'2417': '201100889'
'2418': '201100894'
'2419': '201100898'
'2420': '201100900'
'2421': '201100901'
'2422': '201100902'
'2423': '201100904'
'2424': '201100905'
'2425': '201100906'
'2426': '201100908'
'2427': '201100911'
'2428': '201100913'
'2429': '201100945'
'2430': '201100946'
'2431': '201101004'
'2432': '201101014'
'2433': '201101022'
'2434': '201101038'
'2435': '201101640'
'2436': '201101643'
'2437': '201101644'
'2438': '201101756'
'2439': '201101761'
'2440': '201101840'
'2441': '201101843'
'2442': '201101844'
'2443': '201101845'
'2444': '201101846'
'2445': '201101847'
'2446': '201101857'
'2447': '201101860'
'2448': '201101861'
'2449': '201101884'
'2450': '201101885'
'2451': '201102450'
'2452': '201102476'
'2453': '201102477'
'2454': '201102757'
'2455': '201102767'
'2456': '201102768'
'2457': '201102769'
'2458': '201102784'
'2459': '201103159'
'2460': '201103161'
'2461': '201103324'
'2462': '201103422'
'2463': '201103424'
'2464': '201103425'
'2465': '201103468'
'2466': '201103469'
'2467': '201103472'
'2468': '201103475'
'2469': '201103563'
'2470': '201103609'
'2471': '201103615'
'2472': '201103617'
'2473': '201103909'
'2474': '201103986'
'2475': '201104026'
'2476': '201104041'
'2477': '201104062'
'2478': '201104063'
'2479': '201104066'
'2480': '201104111'
'2481': '201104471'
'2482': '201104754'
'2483': '201104755'
'2484': '201104801'
'2485': '201105016'
'2486': '201105106'
'2487': '201105143'
'2488': '201105144'
'2489': '201105270'
'2490': '201105412'
'2491': '201105413'
'2492': '201105484'
'2493': '201105487'
'2494': '201105488'
'2495': '201105600'
'2496': '201105697'
'2497': '201105756'
'2498': '201105914'
'2499': '201105915'
'2500': '201105918'
'2501': '201106145'
'2502': '201106366'
'2503': '201106390'
'2504': '201106781'
'2505': '201106782'
'2506': '201106947'
'2507': '201106949'
'2508': '201106952'
'2509': '201106953'
'2510': '201107002'
'2511': '201107176'
'2512': '201107199'
'2513': '201107201'
'2514': '201107219'
'2515': '201107220'
'2516': '201107224'
'2517': '201107229'
'2518': '201107233'
'2519': '201107234'
'2520': '201107272'
'2521': '201107273'
'2522': '201107290'
'2523': '201107503'
'2524': '201107505'
'2525': '201107507'
'2526': '201107509'
'2527': '201107512'
'2528': '201107513'
'2529': '201107515'
'2530': '201107516'
'2531': '201107525'
'2532': '201107526'
'2533': '201107530'
'2534': '201107536'
'2535': '201107539'
'2536': '201107542'
'2537': '201107543'
'2538': '201108989'
'2539': '201108990'
'2540': '201109024'
'2541': '201109030'
'2542': '201109142'
'2543': '201109150'
'2544': '201109151'
'2545': '201109154'
'2546': '201109155'
'2547': '201109222'
'2548': '201109223'
'2549': '201109224'
'2550': '201109225'
'2551': '201109227'
'2552': '201109228'
'2553': '201109231'
'2554': '201109232'
'2555': '201109259'
'2556': '201109260'
'2557': '201109276'
'2558': '201109277'
'2559': '201109291'
'2560': '201109296'
'2561': '201109309'
'2562': '201109310'
'2563': '201109313'
'2564': '201109442'
'2565': '201109514'
'2566': '201109733'
'2567': '201109818'
'2568': '201110023'
'2569': '201110095'
'2570': '201110136'
'2571': '201110192'
'2572': '201110292'
'2573': '201110398'
'2574': '201110512'
'2575': '201110516'
'2576': '201110646'
'2577': '201111149'
'2578': '201112194'
'2579': '201112195'
'2580': '201200003'
'2581': '201200004'
'2582': '201200009'
'2583': '201200186'
'2584': '201200204'
'2585': '201200246'
'2586': '201200277'
'2587': '201200586'
'2588': '201200603'
'2589': '201200635'
'2590': '201200636'
'2591': '201200678'
'2592': '201200717'
'2593': '201200854'
'2594': '201200930'
'2595': '201200975'
'2596': '201201166'
'2597': '201201261'
'2598': '201201265'
'2599': '201201601'
'2600': '201201602'
'2601': '201201680'
'2602': '201201690'
'2603': '201201693'
'2604': '201201694'
'2605': '201201837'
'2606': '201202398'
'2607': '201202780'
'2608': '201202801'
'2609': '201202902'
'2610': '201203127'
'2611': '201203128'
'2612': '201203513'
'2613': '201203514'
'2614': '201203607'
'2615': '201203608'
'2616': '201203744'
'2617': '201203745'
'2618': '201203928'
'2619': '201204330'
'2620': '201204389'
'2621': '201204555'
'2622': '201204813'
'2623': '201204970'
'2624': '201204971'
'2625': '201204975'
'2626': '201205043'
'2627': '201205150'
'2628': '201205250'
'2629': '201205252'
'2630': '201205253'
'2631': '201205255'
'2632': '201205261'
'2633': '201205262'
'2634': '201205263'
'2635': '201205266'
'2636': '201205414'
'2637': '201205415'
'2638': '201205623'
'2639': '201205628'
'2640': '201205653'
'2641': '201205654'
'2642': '201205825'
'2643': '201205826'
'2644': '201205827'
'2645': '201205828'
'2646': '201205829'
'2647': '201205833'
'2648': '201205930'
'2649': '201205939'
'2650': '201205942'
'2651': '201206020'
'2652': '201206056'
'2653': '201206160'
'2654': '201206161'
'2655': '201206162'
'2656': '201206163'
'2657': '201206166'
'2658': '201206176'
'2659': '201206201'
'2660': '201206205'
'2661': '201206250'
'2662': '201206297'
'2663': '201206327'
'2664': '201206328'
'2665': '201206338'
'2666': '201206403'
'2667': '201206410'
'2668': '201206435'
'2669': '201206489'
'2670': '201206493'
'2671': '201206494'
'2672': '201206499'
'2673': '201206540'
'2674': '201206552'
'2675': '201206553'
'2676': '201206575'
'2677': '201206612'
'2678': '201206613'
'2679': '201206667'
'2680': '201206685'
'2681': '201206704'
'2682': '201206741'
'2683': '201206751'
'2684': '201206760'
'2685': '201206761'
'2686': '201206776'
'2687': '201206804'
'2688': '201206805'
'2689': '201206831'
'2690': '201206844'
'2691': '201206846'
'2692': '201206856'
'2693': '201206857'
'2694': '201206858'
'2695': '201206870'
'2696': '201206871'
'2697': '201206872'
'2698': '201206877'
'2699': '201206904'
'2700': '201206905'
'2701': '201206916'
'2702': '201206934'
'2703': '201206935'
'2704': '201206936'
'2705': '201206953'
'2706': '201206954'
'2707': '201207002'
'2708': '201207003'
'2709': '201207004'
'2710': '201207009'
'2711': '201207014'
'2712': '201207015'
'2713': '201207061'
'2714': '201207067'
'2715': '201207105'
'2716': '201207217'
'2717': '201207236'
'2718': '201207292'
'2719': '201207294'
'2720': '201207300'
'2721': '201207301'
'2722': '201207402'
'2723': '201207403'
'2724': '201207404'
'2725': '201207405'
'2726': '201207417'
'2727': '201207453'
'2728': '201207459'
'2729': '201207461'
'2730': '201207469'
'2731': '201207486'
'2732': '201207488'
'2733': '201207513'
'2734': '201207516'
'2735': '201207517'
'2736': '201207537'
'2737': '201207538'
'2738': '201207539'
'2739': '201207540'
'2740': '201207541'
'2741': '201207575'
'2742': '201207577'
'2743': '201207586'
'2744': '201207589'
'2745': '201207592'
'2746': '201207593'
'2747': '201207595'
'2748': '201207596'
'2749': '201207597'
'2750': '201207688'
'2751': '201207721'
'2752': '201207722'
'2753': '201208078'
'2754': '201208080'
'2755': '201208081'
'2756': '201208110'
'2757': '201208193'
'2758': '201208256'
'2759': '201208462'
'2760': '201208467'
'2761': '201208480'
'2762': '201208482'
'2763': '201208483'
'2764': '201208484'
'2765': '201208518'
'2766': '201208629'
'2767': '201208630'
'2768': '201208632'
'2769': '201208643'
'2770': '201208653'
'2771': '201208654'
'2772': '201208655'
'2773': '201208723'
'2774': '201208740'
'2775': '201208753'
'2776': '201208754'
'2777': '201208855'
'2778': '201208875'
'2779': '201209444'
'2780': '201209757'
'2781': '201210163'
'2782': '201210185'
'2783': '201210187'
'2784': '201210235'
'2785': '201210240'
'2786': '201210744'
'2787': '201211055'
'2788': '201211103'
'2789': '201211564'
'2790': '201300114'
'2791': '201300128'
'2792': '201300139'
'2793': '201300148'
'2794': '201300152'
'2795': '201301236'
'2796': '201301369'
'2797': '201301377'
'2798': '201301498'
'2799': '201301586'
'2800': '201301591'
'2801': '201301621'
'2802': '201301656'
'2803': '201302165'
'2804': '201302169'
'2805': '201302260'
'2806': '201302390'
'2807': '201302618'
'2808': '201302619'
'2809': '201302802'
'2810': '201302936'
'2811': '201302948'
'2812': '201302980'
'2813': '201302982'
'2814': '201302990'
'2815': '201303029'
'2816': '201303030'
'2817': '201303035'
'2818': '201303045'
'2819': '201303269'
'2820': '201304289'
'2821': '201304386'
'2822': '201305008'
'2823': '201305030'
'2824': '201305371'
'2825': '201305403'
'2826': '201305533'
'2827': '201305813'
'2828': '201305827'
'2829': '201305832'
'2830': '201306036'
'2831': '201306176'
'2832': '201306203'
'2833': '201306230'
'2834': '201306262'
'2835': '201306264'
'2836': '201306277'
'2837': '201306289'
'2838': '201306315'
'2839': '201306327'
'2840': '201306328'
'2841': '201306329'
'2842': '201306340'
'2843': '201306344'
'2844': '201306352'
'2845': '201306362'
'2846': '201306363'
'2847': '201306366'
'2848': '201306367'
'2849': '201306368'
'2850': '201306369'
'2851': '201306370'
'2852': '201306376'
'2853': '201306377'
'2854': '201306379'
'2855': '201306383'
'2856': '201306385'
'2857': '201306386'
'2858': '201306404'
'2859': '201306407'
'2860': '201306412'
'2861': '201306419'
'2862': '201306425'
'2863': '201306452'
'2864': '201306453'
'2865': '201306454'
'2866': '201306455'
'2867': '201306549'
'2868': '201306824'
'2869': '201306917'
'2870': '201306922'
'2871': '201306942'
'2872': '201306954'
'2873': '201306969'
'2874': '201306990'
'2875': '201307038'
'2876': '201307042'
'2877': '201307062'
'2878': '201307079'
'2879': '201307092'
'2880': '201307108'
'2881': '201307118'
'2882': '201307120'
'2883': '201307121'
'2884': '201307123'
'2885': '201307124'
'2886': '201307131'
'2887': '201307132'
'2888': '201307149'
'2889': '201307158'
'2890': '201307209'
'2891': '201307243'
'2892': '201307261'
'2893': '201307290'
'2894': '201307342'
'2895': '201307357'
'2896': '201307377'
'2897': '201307379'
'2898': '201307416'
'2899': '201307449'
'2900': '201307476'
'2901': '201307484'
'2902': '201307514'
'2903': '201307515'
'2904': '201307539'
'2905': '201307558'
'2906': '201307559'
'2907': '201307562'
'2908': '201307567'
'2909': '201307570'
'2910': '201307607'
'2911': '201307655'
'2912': '201307682'
'2913': '201307687'
'2914': '201307725'
'2915': '201307726'
'2916': '201307767'
'2917': '201307807'
'2918': '201307881'
'2919': '201307882'
'2920': '201307888'
'2921': '201307896'
'2922': '201307904'
'2923': '201307925'
'2924': '201307926'
'2925': '201307927'
'2926': '201307936'
'2927': '201307937'
'2928': '201307947'
'2929': '201307988'
'2930': '201308005'
'2931': '201308053'
'2932': '201308062'
'2933': '201308067'
'2934': '201308099'
'2935': '201308195'
'2936': '201308197'
'2937': '201308209'
'2938': '201308223'
'2939': '201308236'
'2940': '201308242'
'2941': '201308278'
'2942': '201308287'
'2943': '201308288'
'2944': '201308289'
'2945': '201308295'
'2946': '201308303'
'2947': '201308368'
'2948': '201308371'
'2949': '201308398'
'2950': '201308399'
'2951': '201308416'
'2952': '201308434'
'2953': '201308448'
'2954': '201308450'
'2955': '201308451'
'2956': '201308457'
'2957': '201308458'
'2958': '201308459'
'2959': '201308460'
'2960': '201308461'
'2961': '201308462'
'2962': '201308463'
'2963': '201308464'
'2964': '201308466'
'2965': '201308467'
'2966': '201308468'
'2967': '201308469'
'2968': '201308470'
'2969': '201308476'
'2970': '201308478'
'2971': '201308479'
'2972': '201308480'
'2973': '201308481'
'2974': '201308483'
'2975': '201308485'
'2976': '201308488'
'2977': '201308489'
'2978': '201308490'
'2979': '201308494'
'2980': '201308495'
'2981': '201308496'
'2982': '201308500'
'2983': '201308502'
'2984': '201308504'
'2985': '201308506'
'2986': '201308509'
'2987': '201308524'
'2988': '201308525'
'2989': '201308526'
'2990': '201308537'
'2991': '201308538'
'2992': '201308539'
'2993': '201308540'
'2994': '201308541'
'2995': '201308550'
'2996': '201308552'
'2997': '201308553'
'2998': '201308564'
'2999': '201308566'
'3000': '201308567'
'3001': '201308573'
'3002': '201308574'
'3003': '201308589'
'3004': '201308613'
'3005': '201308626'
'3006': '201308651'
'3007': '201308652'
'3008': '201308671'
'3009': '201308712'
'3010': '201308718'
'3011': '201308824'
'3012': '201308859'
'3013': '201308895'
'3014': '201308906'
'3015': '201308923'
'3016': '201308924'
'3017': '201308926'
'3018': '201308928'
'3019': '201308943'
'3020': '201308977'
'3021': '201309019'
'3022': '201309044'
'3023': '201309067'
'3024': '201309068'
'3025': '201309070'
'3026': '201309071'
'3027': '201309088'
'3028': '201309092'
'3029': '201309126'
'3030': '201309133'
'3031': '201309139'
'3032': '201309145'
'3033': '201309146'
'3034': '201309164'
'3035': '201309176'
'3036': '201309180'
'3037': '201309184'
'3038': '201309266'
'3039': '201309269'
'3040': '201309336'
'3041': '201309360'
'3042': '201309384'
'3043': '201309416'
'3044': '201309417'
'3045': '201309418'
'3046': '201309425'
'3047': '201309444'
'3048': '201309449'
'3049': '201309460'
'3050': '201309480'
'3051': '201309481'
'3052': '201309484'
'3053': '201309488'
'3054': '201309494'
'3055': '201309539'
'3056': '201309566'
'3057': '201309576'
'3058': '201309599'
'3059': '201309682'
'3060': '201309684'
'3061': '201309738'
'3062': '201309741'
'3063': '201309743'
'3064': '201309744'
'3065': '201309809'
'3066': '201309814'
'3067': '201309848'
'3068': '201309850'
'3069': '201309883'
'3070': '201309888'
'3071': '201309895'
'3072': '201309896'
'3073': '201309897'
'3074': '201309926'
'3075': '201309959'
'3076': '201309963'
'3077': '201309964'
'3078': '201309973'
'3079': '201309994'
'3080': '201310000'
'3081': '201310013'
'3082': '201310206'
'3083': '201310293'
'3084': '201310300'
'3085': '201310318'
'3086': '201310334'
'3087': '201310348'
'3088': '201310352'
'3089': '201310361'
'3090': '201310362'
'3091': '201310380'
'3092': '201310381'
'3093': '201310382'
'3094': '201310387'
'3095': '201310393'
'3096': '201310399'
'3097': '201310402'
'3098': '201310409'
'3099': '201310587'
'3100': '201310588'
'3101': '201310589'
'3102': '201310590'
'3103': '201310619'
'3104': '201310620'
'3105': '201310720'
'3106': '201310742'
'3107': '201310756'
'3108': '201310764'
'3109': '201310777'
'3110': '201310780'
'3111': '201310781'
'3112': '201310782'
'3113': '201310783'
'3114': '201310785'
'3115': '201310797'
'3116': '201310798'
'3117': '201310820'
'3118': '201310826'
'3119': '201310830'
'3120': '201310841'
'3121': '201310842'
'3122': '201310869'
'3123': '201310895'
'3124': '201310896'
'3125': '201310897'
'3126': '201310931'
'3127': '201310935'
'3128': '201310964'
'3129': '201310977'
'3130': '201310978'
'3131': '201310991'
'3132': '201400024'
'3133': '201400053'
'3134': '201400099'
'3135': '201400118'
'3136': '201400120'
'3137': '201400193'
'3138': '201400200'
'3139': '201400262'
'3140': '201400286'
'3141': '201400288'
'3142': '201400289'
'3143': '201400290'
'3144': '201400293'
'3145': '201400337'
'3146': '201400352'
'3147': '201400366'
'3148': '201400367'
'3149': '201400372'
'3150': '201400381'
'3151': '201400382'
'3152': '201400385'
'3153': '201400386'
'3154': '201400387'
'3155': '201400392'
'3156': '201400393'
'3157': '201400409'
'3158': '201400410'
'3159': '201400422'
'3160': '201400426'
'3161': '201400429'
'3162': '201400433'
'3163': '201400434'
'3164': '201400435'
'3165': '201400438'
'3166': '201400451'
'3167': '201400452'
'3168': '201400493'
'3169': '201400497'
'3170': '201400537'
'3171': '201400538'
'3172': '201400681'
'3173': '201400706'
'3174': '201400757'
'3175': '201400771'
'3176': '201400785'
'3177': '201400804'
'3178': '201400828'
'3179': '201400834'
'3180': '201400842'
'3181': '201400873'
'3182': '201400918'
'3183': '201400931'
'3184': '201400932'
'3185': '201400933'
'3186': '201400980'
'3187': '201401034'
'3188': '201401057'
'3189': '201401098'
'3190': '201401129'
'3191': '201401226'
'3192': '201401227'
'3193': '201401235'
'3194': '201401334'
'3195': '201401343'
'3196': '201401344'
'3197': '201401372'
'3198': '201401376'
'3199': '201401417'
'3200': '201401456'
'3201': '201401475'
'3202': '201401487'
'3203': '201401489'
'3204': '201401490'
'3205': '201401518'
'3206': '201401569'
'3207': '201401591'
'3208': '201401625'
'3209': '201401626'
'3210': '201401730'
'3211': '201401871'
'3212': '201401904'
'3213': '201401921'
'3214': '201401923'
'3215': '201401937'
'3216': '201401965'
'3217': '201401966'
'3218': '201401984'
'3219': '201401985'
'3220': '201401986'
'3221': '201401996'
'3222': '201402039'
'3223': '201402056'
'3224': '201402065'
'3225': '201402070'
'3226': '201402071'
'3227': '201402074'
'3228': '201402137'
'3229': '201402148'
'3230': '201402159'
'3231': '201402185'
'3232': '201402206'
'3233': '201402245'
'3234': '201402272'
'3235': '201402276'
'3236': '201402309'
'3237': '201402414'
'3238': '201402416'
'3239': '201402427'
'3240': '201402435'
'3241': '201402436'
'3242': '201402450'
'3243': '201402463'
'3244': '201402464'
'3245': '201402560'
'3246': '201402579'
'3247': '201402641'
'3248': '201402642'
'3249': '201402652'
'3250': '201402682'
'3251': '201402707'
'3252': '201402711'
'3253': '201402714'
'3254': '201402733'
'3255': '201402734'
'3256': '201402804'
'3257': '201402806'
'3258': '201402822'
'3259': '201402828'
'3260': '201402829'
'3261': '201402850'
'3262': '201402856'
'3263': '201402864'
'3264': '201402865'
'3265': '201402866'
'3266': '201402868'
'3267': '201402869'
'3268': '201402875'
'3269': '201402880'
'3270': '201402890'
'3271': '201402909'
'3272': '201402912'
'3273': '201402913'
'3274': '201402936'
'3275': '201402997'
'3276': '201403012'
'3277': '201403023'
'3278': '201403026'
'3279': '201403029'
'3280': '201403031'
'3281': '201403051'
'3282': '201403070'
'3283': '201403088'
'3284': '201403118'
'3285': '201403126'
'3286': '201403127'
'3287': '201403146'
'3288': '201403148'
'3289': '201403152'
'3290': '201403184'
'3291': '201403185'
'3292': '201403217'
'3293': '201403243'
'3294': '201403253'
'3295': '201403298'
'3296': '201403332'
'3297': '201403337'
'3298': '201403380'
'3299': '201403382'
'3300': '201403388'
'3301': '201403389'
'3302': '201403391'
'3303': '201403398'
'3304': '201403407'
'3305': '201403408'
'3306': '201403417'
'3307': '201403429'
'3308': '201403455'
'3309': '201403456'
'3310': '201403457'
'3311': '201403497'
'3312': '201403501'
'3313': '201403518'
'3314': '201403519'
'3315': '201403533'
'3316': '201403537'
'3317': '201403543'
'3318': '201403550'
'3319': '201403559'
'3320': '201403560'
'3321': '201403567'
'3322': '201403570'
'3323': '201403571'
'3324': '201403572'
'3325': '201403597'
'3326': '201403598'
'3327': '201403617'
'3328': '201403627'
'3329': '201403650'
'3330': '201403692'
'3331': '201403693'
'3332': '201403694'
'3333': '201403697'
'3334': '201403698'
'3335': '201403726'
'3336': '201403732'
'3337': '201403734'
'3338': '201403739'
'3339': '201403797'
'3340': '201403816'
'3341': '201403819'
'3342': '201403820'
'3343': '201403826'
'3344': '201403833'
'3345': '201403870'
'3346': '201403877'
'3347': '201403880'
'3348': '201403884'
'3349': '201403886'
'3350': '201403889'
'3351': '201403891'
'3352': '201403892'
'3353': '201403894'
'3354': '201403919'
'3355': '201403930'
'3356': '201403939'
'3357': '201403943'
'3358': '201403944'
'3359': '201403945'
'3360': '201403947'
'3361': '201403952'
'3362': '201403955'
'3363': '201403957'
'3364': '201403967'
'3365': '201403975'
'3366': '201403983'
'3367': '201404005'
'3368': '201404006'
'3369': '201404008'
'3370': '201404009'
'3371': '201404010'
'3372': '201404013'
'3373': '201404014'
'3374': '201404015'
'3375': '201404016'
'3376': '201404026'
'3377': '201404035'
'3378': '201404036'
'3379': '201404038'
'3380': '201404042'
'3381': '201404043'
'3382': '201404088'
'3383': '201404093'
'3384': '201404114'
'3385': '201404115'
'3386': '201404116'
'3387': '201404136'
'3388': '201404162'
'3389': '201404163'
'3390': '201404168'
'3391': '201404170'
'3392': '201404198'
'3393': '201404200'
'3394': '201404206'
'3395': '201404215'
'3396': '201404239'
'3397': '201404271'
'3398': '201404287'
'3399': '201404293'
'3400': '201404299'
'3401': '201404316'
'3402': '201404327'
'3403': '201404340'
'3404': '201404342'
'3405': '201404343'
'3406': '201404377'
'3407': '201404400'
'3408': '201404403'
'3409': '201404406'
'3410': '201404415'
'3411': '201404417'
'3412': '201404424'
'3413': '201404425'
'3414': '201404426'
'3415': '201404441'
'3416': '201404454'
'3417': '201404456'
'3418': '201404458'
'3419': '201404483'
'3420': '201404486'
'3421': '201404487'
'3422': '201404488'
'3423': '201404497'
'3424': '201404513'
'3425': '201404527'
'3426': '201404528'
'3427': '201404537'
'3428': '201404538'
'3429': '201404539'
'3430': '201404541'
'3431': '201404566'
'3432': '201404569'
'3433': '201404590'
'3434': '201404591'
'3435': '201404592'
'3436': '201404593'
'3437': '201404595'
'3438': '201404596'
'3439': '201404597'
'3440': '201404633'
'3441': '201404634'
'3442': '201404644'
'3443': '201404654'
'3444': '201404664'
'3445': '201404677'
'3446': '201404678'
'3447': '201404689'
'3448': '201404690'
'3449': '201404696'
'3450': '201404714'
'3451': '201404715'
'3452': '201404716'
'3453': '201404731'
'3454': '201404732'
'3455': '201404733'
'3456': '201404741'
'3457': '201404786'
'3458': '201404800'
'3459': '201404801'
'3460': '201404824'
'3461': '201404836'
'3462': '201404857'
'3463': '201404863'
'3464': '201404889'
'3465': '201404928'
'3466': '201404929'
'3467': '201404930'
'3468': '201404976'
'3469': '201404981'
'3470': '201404983'
'3471': '201404986'
'3472': '201404993'
'3473': '201404994'
'3474': '201404995'
'3475': '201405000'
'3476': '201405003'
'3477': '201405010'
'3478': '201405014'
'3479': '201405026'
'3480': '201405027'
'3481': '201405033'
'3482': '201405042'
'3483': '201405043'
'3484': '201405044'
'3485': '201405045'
'3486': '201405046'
'3487': '201405050'
'3488': '201405051'
'3489': '201405052'
'3490': '201405063'
'3491': '201405073'
'3492': '201405078'
'3493': '201405079'
'3494': '201405080'
'3495': '201405094'
'3496': '201405096'
'3497': '201405100'
'3498': '201405103'
'3499': '201405104'
'3500': '201405107'
'3501': '201405111'
'3502': '201405112'
'3503': '201405113'
'3504': '201405114'
'3505': '201405130'
'3506': '201405131'
'3507': '201405132'
'3508': '201405175'
'3509': '201405177'
'3510': '201405232'
'3511': '201405233'
'3512': '201405262'
'3513': '201405282'
'3514': '201405290'
'3515': '201405347'
'3516': '201405387'
'3517': '201405424'
'3518': '201405425'
'3519': '201405434'
'3520': '201405436'
'3521': '201405439'
'3522': '201405449'
'3523': '201405450'
'3524': '201405454'
'3525': '201405464'
'3526': '201405467'
'3527': '201405468'
'3528': '201405469'
'3529': '201405482'
'3530': '201405502'
'3531': '201405503'
'3532': '201405504'
'3533': '201405519'
'3534': '201405520'
'3535': '201405531'
'3536': '201405585'
'3537': '201405586'
'3538': '201405598'
'3539': '201405610'
'3540': '201405615'
'3541': '201405693'
'3542': '201405708'
'3543': '201405813'
'3544': '201405839'
'3545': '201405840'
'3546': '201405841'
'3547': '201405844'
'3548': '201405857'
'3549': '201405865'
'3550': '201405882'
'3551': '201405921'
'3552': '201405930'
'3553': '201405956'
'3554': '201405971'
'3555': '201405995'
'3556': '201406030'
'3557': '201406032'
'3558': '201406033'
'3559': '201406046'
'3560': '201406102'
'3561': '201406125'
'3562': '201406126'
'3563': '201406140'
'3564': '201406156'
'3565': '201406161'
'3566': '201406170'
'3567': '201406171'
'3568': '201406179'
'3569': '201406315'
'3570': '201500025'
'3571': '201500081'
'3572': '201500082'
'3573': '201500083'
'3574': '201500099'
'3575': '201500101'
'3576': '201500107'
'3577': '201500114'
'3578': '201500127'
'3579': '201500145'
'3580': '201500149'
'3581': '201500168'
'3582': '201500169'
'3583': '201500171'
'3584': '201500179'
'3585': '201500233'
'3586': '201500293'
'3587': '201500313'
'3588': '201500364'
'3589': '201500365'
'3590': '201500366'
'3591': '201500367'
'3592': '201500372'
'3593': '201500387'
'3594': '201500401'
'3595': '201500429'
'3596': '201500450'
'3597': '201500458'
'3598': '201500483'
'3599': '201500509'
'3600': '201500516'
'3601': '201500517'
'3602': '201500521'
'3603': '201500522'
'3604': '201500530'
'3605': '201500554'
'3606': '201500645'
'3607': '201500650'
'3608': '201500694'
'3609': '201500722'
'3610': '201500725'
'3611': '201500728'
'3612': '201500731'
'3613': '201500852'
'3614': '201500856'
'3615': '201500902'
'3616': '201500903'
'3617': '201500929'
'3618': '201500948'
'3619': '201500959'
'3620': '201500963'
'3621': '201500964'
'3622': '201500965'
'3623': '201500966'
'3624': '201500967'
'3625': '201500969'
'3626': '201500972'
'3627': '201500982'
'3628': '201501074'
'3629': '201501083'
'3630': '201501128'
'3631': '201501207'
'3632': '201501216'
'3633': '201501270'
'3634': '201501274'
'3635': '201501307'
'3636': '201501322'
'3637': '201501325'
'3638': '201501350'
'3639': '201501441'
'3640': '201501446'
'3641': '201501476'
'3642': '201501488'
'3643': '201501494'
'3644': '201501500'
'3645': '201501501'
'3646': '201501503'
'3647': '201501539'
'3648': '201501540'
'3649': '201501548'
'3650': '201501550'
'3651': '201501555'
'3652': '201501556'
'3653': '201501557'
'3654': '201501569'
'3655': '201501592'
'3656': '201501595'
'3657': '201501605'
'3658': '201501608'
'3659': '201501609'
'3660': '201501610'
'3661': '201501611'
'3662': '201501612'
'3663': '201501622'
'3664': '201501624'
'3665': '201501644'
'3666': '201501647'
'3667': '201501657'
'3668': '201501658'
'3669': '201501663'
'3670': '201501675'
'3671': '201501676'
'3672': '201501677'
'3673': '201501680'
'3674': '201501687'
'3675': '201501689'
'3676': '201501690'
'3677': '201501693'
'3678': '201501698'
'3679': '201501702'
'3680': '201501703'
'3681': '201501714'
'3682': '201501723'
'3683': '201501724'
'3684': '201501725'
'3685': '201501730'
'3686': '201501733'
'3687': '201501741'
'3688': '201501744'
'3689': '201501778'
'3690': '201501783'
'3691': '201501790'
'3692': '201501791'
'3693': '201501811'
'3694': '201501814'
'3695': '201501853'
'3696': '201501854'
'3697': '201501865'
'3698': '201501874'
'3699': '201501885'
'3700': '201501891'
'3701': '201501925'
'3702': '201501928'
'3703': '201501929'
'3704': '201501931'
'3705': '201501944'
'3706': '201501965'
'3707': '201501978'
'3708': '201501979'
'3709': '201501980'
'3710': '201502217'
'3711': '201502267'
'3712': '201502291'
'3713': '201502292'
'3714': '201502307'
'3715': '201502310'
'3716': '201502330'
'3717': '201502345'
'3718': '201502363'
'3719': '201502371'
'3720': '201502379'
'3721': '201502394'
'3722': '201502397'
'3723': '201502417'
'3724': '201502419'
'3725': '201502420'
'3726': '201502421'
'3727': '201502450'
'3728': '201502453'
'3729': '201502454'
'3730': '201502471'
'3731': '201502552'
'3732': '201502560'
'3733': '201502562'
'3734': '201502564'
'3735': '201502570'
'3736': '201502606'
'3737': '201502607'
'3738': '201502608'
'3739': '201502609'
'3740': '201502631'
'3741': '201502638'
'3742': '201502639'
'3743': '201502640'
'3744': '201502644'
'3745': '201502645'
'3746': '201502654'
'3747': '201502656'
'3748': '201502658'
'3749': '201502659'
'3750': '201502670'
'3751': '201502671'
'3752': '201502677'
'3753': '201502678'
'3754': '201502689'
'3755': '201502690'
'3756': '201502703'
'3757': '201502704'
'3758': '201502705'
'3759': '201502708'
'3760': '201502709'
'3761': '201502710'
'3762': '201502712'
'3763': '201502714'
'3764': '201502717'
'3765': '201502718'
'3766': '201502727'
'3767': '201502742'
'3768': '201502761'
'3769': '201502766'
'3770': '201502770'
'3771': '201502773'
'3772': '201502774'
'3773': '201502775'
'3774': '201502785'
'3775': '201502786'
'3776': '201502790'
'3777': '201502793'
'3778': '201502813'
'3779': '201502862'
'3780': '201502874'
'3781': '201502890'
'3782': '201502917'
'3783': '201502924'
'3784': '201502926'
'3785': '201502952'
'3786': '201502969'
'3787': '201502979'
'3788': '201502986'
'3789': '201502987'
'3790': '201502994'
'3791': '201503006'
'3792': '201503025'
'3793': '201503026'
'3794': '201503027'
'3795': '201503029'
'3796': '201503051'
'3797': '201503142'
'3798': '201503144'
'3799': '201503191'
'3800': '201503193'
'3801': '201503212'
'3802': '201503218'
'3803': '201503219'
'3804': '201503234'
'3805': '201503249'
'3806': '201503251'
'3807': '201503253'
'3808': '201503288'
'3809': '201503289'
'3810': '201503290'
'3811': '201503293'
'3812': '201503295'
'3813': '201503302'
'3814': '201503335'
'3815': '201503336'
'3816': '201503468'
'3817': '201503475'
'3818': '201503482'
'3819': '201503488'
'3820': '201503490'
'3821': '201503506'
'3822': '201503516'
'3823': '201503520'
'3824': '201503532'
'3825': '201503554'
'3826': '201503575'
'3827': '201503600'
'3828': '201503641'
'3829': '201503645'
'3830': '201503647'
'3831': '201503683'
'3832': '201503771'
'3833': '201503772'
'3834': '201503783'
'3835': '201503839'
'3836': '201503865'
'3837': '201503876'
'3838': '201503920'
'3839': '201503921'
'3840': '201503948'
'3841': '201504046'
'3842': '201504047'
'3843': '201504070'
'3844': '201504073'
'3845': '201504084'
'3846': '201504106'
'3847': '201504115'
'3848': '201504119'
'3849': '201504124'
'3850': '201504135'
'3851': '201504137'
'3852': '201504179'
'3853': '201504199'
'3854': '201504202'
'3855': '201504203'
'3856': '201504205'
'3857': '201504225'
'3858': '201504231'
'3859': '201504249'
'3860': '201504255'
'3861': '201504259'
'3862': '201504303'
'3863': '201504313'
'3864': '201504314'
'3865': '201504315'
'3866': '201504317'
'3867': '201504318'
'3868': '201504319'
'3869': '201504320'
'3870': '201504321'
'3871': '201504326'
'3872': '201504332'
'3873': '201504333'
'3874': '201504334'
'3875': '201504493'
'3876': '201504494'
'3877': '201504495'
'3878': '201504519'
'3879': '201504729'
'3880': '201504732'
'3881': '201504763'
'3882': '201504788'
'3883': '201504988'
'3884': '201505072'
'3885': '201505074'
'3886': '201505091'
'3887': '201505114'
'3888': '201505137'
'3889': '201505162'
'3890': '201505163'
'3891': '201505164'
'3892': '201505167'
'3893': '201505168'
'3894': '201505169'
'3895': '201505192'
'3896': '201505198'
'3897': '201505199'
'3898': '201505259'
'3899': '201505342'
'3900': '201505350'
'3901': '201505365'
'3902': '201505441'
'3903': '201505522'
'3904': '201505607'
'3905': '201505621'
'3906': '201505725'
'3907': '201505737'
'3908': '201505796'
'3909': '201505797'
'3910': '201505830'
'3911': '201505842'
'3912': '201505843'
'3913': '201505901'
'3914': '201505903'
'3915': '201505904'
'3916': '201505920'
'3917': '201505921'
'3918': '201505935'
'3919': '201506120'
'3920': '201506121'
'3921': '201506122'
'3922': '201506136'
'3923': '201506143'
'3924': '201506288'
'3925': '201506398'
'3926': '201506402'
'3927': '201506405'
'3928': '201506481'
'3929': '201506489'
'3930': '201506507'
'3931': '201506569'
'3932': '201506583'
'3933': '201506589'
'3934': '201506623'
'3935': '201506624'
'3936': '201506639'
'3937': '201506649'
'3938': '201506677'
'3939': '201506684'
'3940': '201506685'
'3941': '201506686'
'3942': '201506729'
'3943': '201506734'
'3944': '201506748'
'3945': '201506766'
'3946': '201506769'
'3947': '201506783'
'3948': '201506785'
'3949': '201506786'
'3950': '201506795'
'3951': '201506797'
'3952': '201506799'
'3953': '201506815'
'3954': '201506816'
'3955': '201506835'
'3956': '201506843'
'3957': '201506848'
'3958': '201506851'
'3959': '201506856'
'3960': '201506857'
'3961': '201506873'
'3962': '201506874'
'3963': '201506894'
'3964': '201507014'
'3965': '201507045'
'3966': '201507071'
'3967': '201507072'
'3968': '201507074'
'3969': '201507075'
'3970': '201507134'
'3971': '201507139'
'3972': '201507146'
'3973': '201507195'
'3974': '201507196'
'3975': '201507261'
'3976': '201507270'
'3977': '201507272'
'3978': '201507273'
'3979': '201507288'
'3980': '201507291'
'3981': '201507380'
'3982': '201507381'
'3983': '201507390'
'3984': '201507399'
'3985': '201507405'
'3986': '201507411'
'3987': '201507412'
'3988': '201507413'
'3989': '201507418'
'3990': '201507420'
'3991': '201507424'
'3992': '201507426'
'3993': '201507452'
'3994': '201507455'
'3995': '201507458'
'3996': '201507464'
'3997': '201507465'
'3998': '201507470'
'3999': '201507477'
'4000': '201507478'
'4001': '201507501'
'4002': '201507502'
'4003': '201507517'
'4004': '201507521'
'4005': '201507527'
'4006': '201507582'
'4007': '201507644'
'4008': '201507645'
'4009': '201507681'
'4010': '201507682'
'4011': '201507715'
'4012': '201507736'
'4013': '201507737'
'4014': '201507738'
'4015': '201507762'
'4016': '201507763'
'4017': '201507764'
'4018': '201507776'
'4019': '201507778'
'4020': '201507785'
'4021': '201507786'
'4022': '201507806'
'4023': '201507839'
'4024': '201507840'
'4025': '201507875'
'4026': '201507876'
'4027': '201507887'
'4028': '201507888'
'4029': '201507891'
'4030': '201507895'
'4031': '201507896'
'4032': '201507897'
'4033': '201507898'
'4034': '201507899'
'4035': '201507944'
'4036': '201507948'
'4037': '201507964'
'4038': '201507965'
'4039': '201507966'
'4040': '201507967'
'4041': '201507968'
'4042': '201507984'
'4043': '201507985'
'4044': '201507986'
'4045': '201507997'
'4046': '201507998'
'4047': '201507999'
'4048': '201508003'
'4049': '201508004'
'4050': '201508018'
'4051': '201508019'
'4052': '201508116'
'4053': '201508117'
'4054': '201508139'
'4055': '201508213'
'4056': '201508214'
'4057': '201508231'
'4058': '201508232'
'4059': '201508272'
'4060': '201508281'
'4061': '201508325'
'4062': '201508372'
'4063': '201508378'
'4064': '201508403'
'4065': '201508407'
'4066': '201508411'
'4067': '201508428'
'4068': '201508435'
'4069': '201508436'
'4070': '201508440'
'4071': '201508441'
'4072': '201508453'
'4073': '201508461'
'4074': '201508476'
'4075': '201508477'
'4076': '201508486'
'4077': '201508519'
'4078': '201508531'
'4079': '201508532'
'4080': '201508542'
'4081': '201600113'
'4082': '201600170'
'4083': '201600171'
'4084': '201600172'
'4085': '201600191'
'4086': '201600217'
'4087': '201600251'
'4088': '201600274'
'4089': '201600275'
'4090': '201600300'
'4091': '201600301'
'4092': '201600339'
'4093': '201600340'
'4094': '201600356'
'4095': '201600358'
'4096': '201600359'
'4097': '201600361'
'4098': '201600391'
'4099': '201600396'
'4100': '201600407'
'4101': '201600425'
'4102': '201600426'
'4103': '201600436'
'4104': '201600445'
'4105': '201600498'
'4106': '201600500'
'4107': '201600505'
'4108': '201600507'
'4109': '201600515'
'4110': '201600516'
'4111': '201600519'
'4112': '201600568'
'4113': '201600569'
'4114': '201600572'
'4115': '201600573'
'4116': '201600574'
'4117': '201600584'
'4118': '201600585'
'4119': '201600601'
'4120': '201600695'
'4121': '201600696'
'4122': '201600731'
'4123': '201600777'
'4124': '201600782'
'4125': '201600828'
'4126': '201600839'
'4127': '201600853'
'4128': '201600903'
'4129': '201600933'
'4130': '201600938'
'4131': '201600941'
'4132': '201600974'
'4133': '201600978'
'4134': '201600986'
'4135': '201601004'
'4136': '201601006'
'4137': '201601009'
'4138': '201601030'
'4139': '201601045'
'4140': '201601046'
'4141': '201601047'
'4142': '201601050'
'4143': '201601053'
'4144': '201601071'
'4145': '201601081'
'4146': '201601082'
'4147': '201601112'
'4148': '201601113'
'4149': '201601117'
'4150': '201601119'
'4151': '201601149'
'4152': '201601191'
'4153': '201601195'
'4154': '201601196'
'4155': '201601197'
'4156': '201601201'
'4157': '201601202'
'4158': '201601203'
'4159': '201601219'
'4160': '201601220'
'4161': '201601221'
'4162': '201601231'
'4163': '201601232'
'4164': '201601233'
'4165': '201601234'
'4166': '201601249'
'4167': '201601347'
'4168': '201601506'
'4169': '201601522'
'4170': '201601537'
'4171': '201601552'
'4172': '201601553'
'4173': '201601554'
'4174': '201601555'
'4175': '201601643'
'4176': '201601680'
'4177': '201601683'
'4178': '201601700'
'4179': '201601702'
'4180': '201601703'
'4181': '201601721'
'4182': '201601725'
'4183': '201601728'
'4184': '201601731'
'4185': '201601734'
'4186': '201601737'
'4187': '201601742'
'4188': '201601743'
'4189': '201601761'
'4190': '201601762'
'4191': '201601896'
'4192': '201602068'
'4193': '201602080'
'4194': '201602081'
'4195': '201602085'
'4196': '201602122'
'4197': '201602235'
'4198': '201602239'
'4199': '201602274'
'4200': '201602286'
'4201': '201602294'
'4202': '201602295'
'4203': '201602296'
'4204': '201602307'
'4205': '201602314'
'4206': '201602328'
'4207': '201602339'
'4208': '201602349'
'4209': '201602351'
'4210': '201602362'
'4211': '201602383'
'4212': '201602409'
'4213': '201602421'
'4214': '201602427'
'4215': '201602432'
'4216': '201602491'
'4217': '201602500'
'4218': '201602513'
'4219': '201602518'
'4220': '201602533'
'4221': '201602561'
'4222': '201602610'
'4223': '201602619'
'4224': '201602662'
'4225': '201602702'
'4226': '201602708'
'4227': '201602739'
'4228': '201602766'
'4229': '201602784'
'4230': '201602785'
'4231': '201602888'
'4232': '201602889'
'4233': '201602891'
'4234': '201602908'
'4235': '201602909'
'4236': '201602926'
'4237': '201602930'
'4238': '201602957'
'4239': '201602959'
'4240': '201602965'
'4241': '201602983'
'4242': '201602996'
'4243': '201603025'
'4244': '201603032'
'4245': '201603033'
'4246': '201603057'
'4247': '201603058'
'4248': '201603059'
'4249': '201603073'
'4250': '201603074'
'4251': '201603076'
'4252': '201603084'
'4253': '201603097'
'4254': '201603104'
'4255': '201603127'
'4256': '201603130'
'4257': '201603137'
'4258': '201603147'
'4259': '201603150'
'4260': '201603151'
'4261': '201603152'
'4262': '201603161'
'4263': '201603173'
'4264': '201603178'
'4265': '201603226'
'4266': '201603227'
'4267': '201603228'
'4268': '201603229'
'4269': '201603298'
'4270': '201603314'
'4271': '201603315'
'4272': '201603316'
'4273': '201603317'
'4274': '201603320'
'4275': '201603340'
'4276': '201603363'
'4277': '201603369'
'4278': '201603370'
'4279': '201603416'
'4280': '201603432'
'4281': '201603454'
'4282': '201603455'
'4283': '201603462'
'4284': '201603468'
'4285': '201603493'
'4286': '201603497'
'4287': '201603514'
'4288': '201603530'
'4289': '201603552'
'4290': '201603608'
'4291': '201603615'
'4292': '201603628'
'4293': '201603633'
'4294': '201603644'
'4295': '201603663'
'4296': '201603669'
'4297': '201603670'
'4298': '201603681'
'4299': '201603682'
'4300': '201603683'
'4301': '201603692'
'4302': '201603728'
'4303': '201603729'
'4304': '201603775'
'4305': '201603787'
'4306': '201603794'
'4307': '201603795'
'4308': '201603796'
'4309': '201603807'
'4310': '201603883'
'4311': '201603890'
'4312': '201603892'
'4313': '201603894'
'4314': '201603895'
'4315': '201603899'
'4316': '201603919'
'4317': '201603920'
'4318': '201603929'
'4319': '201603931'
'4320': '201603988'
'4321': '201603989'
'4322': '201603995'
'4323': '201604010'
'4324': '201604018'
'4325': '201604034'
'4326': '201604054'
'4327': '201604060'
'4328': '201604075'
'4329': '201604076'
'4330': '201604117'
'4331': '201604118'
'4332': '201604119'
'4333': '201604147'
'4334': '201604148'
'4335': '201604152'
'4336': '201604178'
'4337': '201604202'
'4338': '201604209'
'4339': '201604247'
'4340': '201604290'
'4341': '201604348'
'4342': '201604367'
'4343': '201604375'
'4344': '201604409'
'4345': '201604410'
'4346': '201604411'
'4347': '201604429'
'4348': '201604461'
'4349': '201604485'
'4350': '201604621'
'4351': '201604622'
'4352': '201604623'
'4353': '201604628'
'4354': '201604631'
'4355': '201604632'
'4356': '201604669'
'4357': '201604677'
'4358': '201604678'
'4359': '201604687'
'4360': '201604696'
'4361': '201604705'
'4362': '201604823'
'4363': '201604942'
'4364': '201605007'
'4365': '201605011'
'4366': '201605135'
'4367': '201605175'
'4368': '201605323'
'4369': '201605325'
'4370': '201605340'
'4371': '201605497'
'4372': '201605721'
'4373': '201605722'
'4374': '201605723'
'4375': '201605749'
'4376': '201605790'
'4377': '201605791'
'4378': '201605792'
'4379': '201605801'
'4380': '201605802'
'4381': '201605803'
'4382': '201605804'
'4383': '201605805'
'4384': '201605806'
'4385': '201605819'
'4386': '201605820'
'4387': '201605821'
'4388': '201605845'
'4389': '201605851'
'4390': '201605853'
'4391': '201605860'
'4392': '201605978'
'4393': '201606046'
'4394': '201606047'
'4395': '201606048'
'4396': '201606049'
'4397': '201606050'
'4398': '201606051'
'4399': '201606053'
'4400': '201606081'
'4401': '201606123'
'4402': '201606189'
'4403': '201606254'
'4404': '201606256'
'4405': '201606264'
'4406': '201606306'
'4407': '201606307'
'4408': '201606380'
'4409': '201606381'
'4410': '201606391'
'4411': '201606394'
'4412': '201606395'
'4413': '201606401'
'4414': '201606419'
'4415': '201606563'
'4416': '201606564'
'4417': '201606593'
'4418': '201606695'
'4419': '201606770'
'4420': '201606772'
'4421': '201606795'
'4422': '201606799'
'4423': '201606818'
'4424': '201606829'
'4425': '201606935'
'4426': '201607019'
'4427': '201607041'
'4428': '201607042'
'4429': '201607043'
'4430': '201607093'
'4431': '201607205'
'4432': '201607206'
'4433': '201607211'
'4434': '201607298'
'4435': '201607299'
'4436': '201700144'
'4437': '201700156'
'4438': '201700243'
'4439': '201700244'
'4440': '201700254'
'4441': '201700377'
'4442': '201700390'
'4443': '201700402'
'4444': '201700404'
'4445': '201700418'
'4446': '201700452'
'4447': '201700463'
'4448': '201700518'
'4449': '201700619'
'4450': '201700620'
'4451': '201700621'
'4452': '201700623'
'4453': '201700684'
'4454': '201700730'
'4455': '201700836'
'4456': '201700897'
'4457': '201700900'
'4458': '201700904'
'4459': '201700906'
'4460': '201700921'
'4461': '201701040'
'4462': '201701044'
'4463': '201701157'
'4464': '201701163'
'4465': '201701169'
'4466': '201701170'
'4467': '201701194'
'4468': '201701196'
'4469': '201701198'
'4470': '201701233'
'4471': '201701288'
'4472': '201701289'
'4473': '201701296'
'4474': '201701303'
'4475': '201701339'
'4476': '201701340'
'4477': '201701380'
'4478': '201701414'
'4479': '201701415'
'4480': '201701435'
'4481': '201701447'
'4482': '201701456'
'4483': '201701458'
'4484': '201701471'
'4485': '201701478'
'4486': '201701479'
'4487': '201701501'
'4488': '201701538'
'4489': '201701544'
'4490': '201701545'
'4491': '201701546'
'4492': '201701551'
'4493': '201701574'
'4494': '201701611'
'4495': '201701717'
'4496': '201701749'
'4497': '201701750'
'4498': '201701865'
'4499': '201701878'
'4500': '201701879'
'4501': '201701882'
'4502': '201701960'
'4503': '201702166'
'4504': '201702167'
'4505': '201702168'
'4506': '201702204'
'4507': '201702236'
'4508': '201702237'
'4509': '201702238'
'4510': '201702244'
'4511': '201702246'
'4512': '201702256'
'4513': '201702259'
'4514': '201702281'
'4515': '201702386'
'4516': '201702387'
'4517': '201702390'
'4518': '201702397'
'4519': '201702398'
'4520': '201702399'
'4521': '201702401'
'4522': '201702402'
'4523': '201702403'
'4524': '201702404'
'4525': '201702485'
'4526': '201702492'
'4527': '201702493'
'4528': '201702508'
'4529': '201702526'
'4530': '201702527'
'4531': '201702528'
'4532': '201702541'
'4533': '201702542'
'4534': '201702577'
'4535': '201702610'
'4536': '201702624'
'4537': '201702688'
'4538': '201702735'
'4539': '201702768'
'4540': '201702781'
'4541': '201702782'
'4542': '201702783'
'4543': '201702795'
'4544': '201703991'
'4545': '201703995'
'4546': '201704017'
'4547': '201704542'
'4548': '201704726'
'4549': '201704753'
'4550': '201704761'
'4551': '201704762'
'4552': '201704782'
'4553': '201704786'
'4554': '201704833'
'4555': '201704879'
'4556': '201705029'
'4557': '201705044'
'4558': '201705174'
'4559': '201705175'
'4560': '201705206'
'4561': '201705217'
'4562': '201705219'
'4563': '201705220'
'4564': '201705221'
'4565': '201705254'
'4566': '201705255'
'4567': '201705467'
'4568': '201705468'
'4569': '201705487'
'4570': '201705488'
'4571': '201705491'
'4572': '201705673'
'4573': '201705684'
'4574': '201705685'
'4575': '201705686'
'4576': '201705687'
'4577': '201705688'
'4578': '201705689'
'4579': '201705789'
'4580': '201705790'
'4581': '201705809'
'4582': '201705811'
'4583': '201706137'
'4584': '201706144'
'4585': '201706188'
'4586': '201706209'
'4587': '201706210'
'4588': '201706211'
'4589': '201706212'
'4590': '201706221'
'4591': '201706237'
'4592': '201706344'
'4593': '201706345'
'4594': '201706353'
'4595': '201706355'
'4596': '201706473'
'4597': '201706474'
'4598': '201706555'
'4599': '201706556'
'4600': '201706557'
'4601': '201706560'
'4602': '201706564'
'4603': '201706565'
'4604': '201706571'
'4605': '201706572'
'4606': '201706573'
'4607': '201706574'
'4608': '201706575'
'4609': '201706583'
'4610': '201706584'
'4611': '201706641'
'4612': '201706642'
'4613': '201706690'
'4614': '201706734'
'4615': '201706749'
'4616': '201706750'
'4617': '201706753'
'4618': '201706786'
'4619': '201706787'
'4620': '201706788'
'4621': '201706789'
'4622': '201706790'
'4623': '201706791'
'4624': '201706798'
'4625': '201706814'
'4626': '201706821'
'4627': '201706825'
'4628': '201706845'
'4629': '201706853'
'4630': '201706858'
'4631': '201706866'
'4632': '201706889'
'4633': '201706890'
'4634': '201706899'
'4635': '201706900'
'4636': '201706969'
'4637': '201707021'
'4638': '201707062'
'4639': '201707098'
'4640': '201707167'
'4641': '201707212'
'4642': '201707216'
'4643': '201707217'
'4644': '201707234'
'4645': '201707244'
'4646': '201707245'
'4647': '201707248'
'4648': '201707295'
'4649': '201707296'
'4650': '201707302'
'4651': '201707311'
'4652': '201707327'
'4653': '201707350'
'4654': '201707352'
'4655': '201707392'
'4656': '201707443'
'4657': '201707526'
'4658': '201707534'
'4659': '201707535'
'4660': '201707539'
'4661': '201707601'
'4662': '201707616'
'4663': '201707639'
'4664': '201707644'
'4665': '201707649'
'4666': '201707653'
'4667': '201707658'
'4668': '201707659'
'4669': '201707660'
'4670': '201707669'
'4671': '201707674'
'4672': '201707675'
'4673': '201707837'
'4674': '201707838'
'4675': '201707853'
'4676': '201707860'
'4677': '201707937'
'4678': '201707938'
'4679': '201707939'
'4680': '201707944'
'4681': '201708018'
'4682': '201708021'
'4683': '201708034'
'4684': '201708043'
'4685': '201708044'
'4686': '201708045'
'4687': '201708055'
'4688': '201708069'
'4689': '201708080'
'4690': '201708083'
'4691': '201708085'
'4692': '201708087'
'4693': '201708091'
'4694': '201708095'
'4695': '201708096'
'4696': '201708110'
'4697': '201708131'
'4698': '201708142'
'4699': '201708144'
'4700': '201708159'
'4701': '201708184'
'4702': '201708188'
'4703': '201708197'
'4704': '201708198'
'4705': '201708199'
'4706': '201708204'
'4707': '201708219'
'4708': '201708227'
'4709': '201708233'
'4710': '201708277'
'4711': '201708280'
'4712': '201708281'
'4713': '201708331'
'4714': '201708341'
'4715': '201708362'
'4716': '201708367'
'4717': '201708368'
'4718': '201708370'
'4719': '201708380'
'4720': '201708416'
'4721': '201708419'
'4722': '201708420'
'4723': '201708422'
'4724': '201708434'
'4725': '201708449'
'4726': '201708450'
'4727': '201708456'
'4728': '201708459'
'4729': '201708465'
'4730': '201708466'
'4731': '201708476'
'4732': '201708478'
'4733': '201708482'
'4734': '201708490'
'4735': '201708492'
'4736': '201708494'
'4737': '201708498'
'4738': '201708499'
'4739': '201708532'
'4740': '201708533'
'4741': '201708542'
'4742': '201708549'
'4743': '201708551'
'4744': '201708555'
'4745': '201708559'
'4746': '201708569'
'4747': '201708570'
'4748': '201708571'
'4749': '201708577'
'4750': '201708578'
'4751': '201708579'
'4752': '201708581'
'4753': '201708582'
'4754': '201708583'
'4755': '201708591'
'4756': '201708594'
'4757': '201708597'
'4758': '201708604'
'4759': '201708606'
'4760': '201708609'
'4761': '201708615'
'4762': '201708620'
'4763': '201708621'
'4764': '201708622'
'4765': '201708628'
'4766': '201708633'
'4767': '201708634'
'4768': '201708635'
'4769': '201708636'
'4770': '201708642'
'4771': '201800016'
'4772': '201800048'
'4773': '201800067'
'4774': '201800087'
'4775': '201800100'
'4776': '201800123'
'4777': '201800128'
'4778': '201800133'
'4779': '201800142'
'4780': '201800184'
'4781': '201800191'
'4782': '201800217'
'4783': '201800228'
'4784': '201800229'
'4785': '201800231'
'4786': '201800239'
'4787': '201800286'
'4788': '201800298'
'4789': '201800299'
'4790': '201800300'
'4791': '201800302'
'4792': '201800304'
'4793': '201800305'
'4794': '201800306'
'4795': '201800307'
'4796': '201800308'
'4797': '201800313'
'4798': '201800314'
'4799': '201800317'
'4800': '201800318'
'4801': '201800319'
'4802': '201800320'
'4803': '201800344'
'4804': '201800496'
'4805': '201800499'
'4806': '201800500'
'4807': '201800503'
'4808': '201800504'
'4809': '201800506'
'4810': '201800507'
'4811': '201800511'
'4812': '201800512'
'4813': '201800513'
'4814': '201800530'
'4815': '201800531'
'4816': '201800546'
'4817': '201800567'
'4818': '201800568'
'4819': '201800576'
'4820': '201800600'
'4821': '201800601'
'4822': '201800609'
'4823': '201800638'
'4824': '201800639'
'4825': '201800640'
'4826': '201800648'
'4827': '201800668'
'4828': '201800675'
'4829': '201800701'
'4830': '201800718'
'4831': '201800764'
'4832': '201800767'
'4833': '201800768'
'4834': '201800791'
'4835': '201800807'
'4836': '201800810'
'4837': '201800811'
'4838': '201800813'
'4839': '201800836'
'4840': '201800860'
'4841': '201800888'
'4842': '201800889'
'4843': '201800907'
'4844': '201800910'
'4845': '201800924'
'4846': '201800930'
'4847': '201800935'
'4848': '201800936'
'4849': '201800937'
'4850': '201800950'
'4851': '201800964'
'4852': '201800992'
'4853': '201801001'
'4854': '201801002'
'4855': '201801008'
'4856': '201801071'
'4857': '201801073'
'4858': '201801087'
'4859': '201801145'
'4860': '201801152'
'4861': '201801153'
'4862': '201801175'
'4863': '201801179'
'4864': '201801224'
'4865': '201801267'
'4866': '201801271'
'4867': '201801291'
'4868': '201801346'
'4869': '201801353'
'4870': '201801354'
'4871': '201801386'
'4872': '201801388'
'4873': '201801415'
'4874': '201801418'
'4875': '201801429'
'4876': '201801430'
'4877': '201801431'
'4878': '201801432'
'4879': '201801433'
'4880': '201801516'
'4881': '201801580'
'4882': '201801617'
'4883': '201801680'
'4884': '201801687'
'4885': '201801715'
'4886': '201801725'
'4887': '201801763'
'4888': '201801764'
'4889': '201801765'
'4890': '201801963'
'4891': '201802050'
'4892': '201802103'
'4893': '201802129'
'4894': '201802146'
'4895': '201802226'
'4896': '201802227'
'4897': '201802228'
'4898': '201802255'
'4899': '201802335'
'4900': '201802339'
'4901': '201802378'
'4902': '201802463'
'4903': '201802579'
'4904': '201802586'
'4905': '201802641'
'4906': '201802652'
'4907': '201802697'
'4908': '201802700'
'4909': '201802737'
'4910': '201802751'
'4911': '201802763'
'4912': '201802764'
'4913': '201802818'
'4914': '201802894'
'4915': '201802909'
'4916': '201803000'
'4917': '201803017'
'4918': '201803041'
'4919': '201803060'
'4920': '201803093'
'4921': '201803122'
'4922': '201803136'
'4923': '201803164'
'4924': '201803165'
'4925': '201803166'
'4926': '201803189'
'4927': '201803233'
'4928': '201803298'
'4929': '201803304'
'4930': '201803367'
'4931': '201803379'
'4932': '201803380'
'4933': '201803500'
'4934': '201803502'
'4935': '201803516'
'4936': '201803529'
'4937': '201803530'
'4938': '201803606'
'4939': '201803649'
'4940': '201803666'
'4941': '201803667'
'4942': '201803677'
'4943': '201803699'
'4944': '201803704'
'4945': '201803731'
'4946': '201803777'
'4947': '201803799'
'4948': '201803815'
'4949': '201803824'
'4950': '201803844'
'4951': '201803853'
'4952': '201803880'
'4953': '201803881'
'4954': '201803887'
'4955': '201803888'
'4956': '201803889'
'4957': '201803896'
'4958': '201803899'
'4959': '201804023'
'4960': '201804024'
'4961': '201804149'
'4962': '201804162'
'4963': '201804237'
'4964': '201804271'
'4965': '201804306'
'4966': '201804307'
'4967': '201804308'
'4968': '201804313'
'4969': '201804352'
'4970': '201804441'
'4971': '201804442'
'4972': '201804443'
'4973': '201804479'
'4974': '201804518'
'4975': '201804572'
'4976': '201804690'
'4977': '201804758'
'4978': '201804760'
'4979': '201804826'
'4980': '201804896'
'4981': '201804988'
'4982': '201805048'
'4983': '201805182'
'4984': '201805183'
'4985': '201805293'
'4986': '201805294'
'4987': '201805295'
'4988': '201805337'
'4989': '201805338'
'4990': '201805340'
'4991': '201805341'
'4992': '201805342'
'4993': '201805351'
'4994': '201900011'
'4995': '201900112'
'4996': '201900142'
'4997': '201900143'
'4998': '201900146'
'4999': '201900219'
'5000': '201900220'
'5001': '201900226'
'5002': '201900267'
'5003': '201900303'
'5004': '201900304'
'5005': '201900321'
'5006': '201900595'
'5007': '201900606'
'5008': '201900630'
'5009': '201900645'
'5010': '201900672'
'5011': '201900673'
'5012': '201900688'
'5013': '201900689'
'5014': '201900713'
'5015': '201900734'
'5016': '201900756'
'5017': '201900766'
'5018': '201900875'
'5019': '201900973'
'5020': '201901014'
'5021': '201901301'
'5022': '201901302'
'5023': '201901457'
'5024': '201901642'
'5025': '201901787'
'5026': '201901858'
'5027': '201901859'
'5028': '201901918'
'5029': '201901919'
'5030': '201902042'
'5031': '201902055'
'5032': '201902071'
'5033': '201902113'
'5034': '201902114'
'5035': '201902200'
'5036': '201902470'
'5037': '201902565'
'5038': '201902754'
'5039': '201902773'
'5040': '201902819'
'5041': '201903304'
'5042': '201903597'
'5043': '201903815'
'5044': '201904168'
'5045': '201904204'
'5046': '201904276'
'5047': '201904421'
'5048': '201904442'
'5049': '201904444'
'5050': '201904825'
'5051': '201905085'
'5052': '201905306'
'5053': '201905460'
'5054': '201905619'
'5055': '201905927'
'5056': '201906789'
'5057': '201906790'
'5058': '201906791'
'5059': '201906797'
'5060': '201906798'
'5061': '201906879'
'5062': '201906921'
'5063': '201906965'
'5064': '201906970'
'5065': '201906992'
'5066': '201907134'
'5067': '201907188'
'5068': '201907198'
'5069': '201907289'
'5070': '201907290'
'5071': '201907386'
'5072': '201907387'
'5073': '201907465'
'5074': '201907466'
'5075': '201907531'
'5076': '201907532'
'5077': '201907533'
'5078': '201907576'
'5079': '201907581'
'5080': '201907607'
'5081': '201907624'
'5082': '201907625'
'5083': '201907803'
'5084': '201908003'
'5085': '201908132'
'5086': '201908146'
'5087': '201908339'
splits:
- name: train
num_bytes: 1099731912557.417
num_examples: 206199
download_size: 1006743681478
dataset_size: 1099731912557.417
- config_name: server-image
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 2686047676.656
num_examples: 2578
download_size: 3930960903
dataset_size: 2686047676.656
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: origin_image
data_files:
- split: train
path: origin_image/train-*
- config_name: server-image
data_files:
- split: train
path: server-image/train-*
---
|
open-llm-leaderboard/details_psmathur__orca_mini_v2_13b | ---
pretty_name: Evaluation run of psmathur/orca_mini_v2_13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [psmathur/orca_mini_v2_13b](https://huggingface.co/psmathur/orca_mini_v2_13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__orca_mini_v2_13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T06:53:23.116359](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v2_13b/blob/main/results_2023-09-23T06-53-23.116359.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.06040268456375839,\n\
\ \"em_stderr\": 0.002439712523172895,\n \"f1\": 0.14132655201342267,\n\
\ \"f1_stderr\": 0.0028412840520175065,\n \"acc\": 0.39529928978029205,\n\
\ \"acc_stderr\": 0.009624116592337587\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.06040268456375839,\n \"em_stderr\": 0.002439712523172895,\n\
\ \"f1\": 0.14132655201342267,\n \"f1_stderr\": 0.0028412840520175065\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06368460955269144,\n \
\ \"acc_stderr\": 0.00672621307880572\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7269139700078927,\n \"acc_stderr\": 0.012522020105869456\n\
\ }\n}\n```"
repo_url: https://huggingface.co/psmathur/orca_mini_v2_13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|arc:challenge|25_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|arc:challenge|25_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T06_53_23.116359
path:
- '**/details_harness|drop|3_2023-09-23T06-53-23.116359.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T06-53-23.116359.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T06_53_23.116359
path:
- '**/details_harness|gsm8k|5_2023-09-23T06-53-23.116359.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T06-53-23.116359.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hellaswag|10_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hellaswag|10_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:28:41.797658.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:59:20.126364.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T19:28:41.797658.parquet'
- split: 2023_08_09T09_59_20.126364
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T09:59:20.126364.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T09:59:20.126364.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T06_53_23.116359
path:
- '**/details_harness|winogrande|5_2023-09-23T06-53-23.116359.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T06-53-23.116359.parquet'
- config_name: results
data_files:
- split: 2023_07_19T19_28_41.797658
path:
- results_2023-07-19T19:28:41.797658.parquet
- split: 2023_08_09T09_59_20.126364
path:
- results_2023-08-09T09:59:20.126364.parquet
- split: 2023_09_23T06_53_23.116359
path:
- results_2023-09-23T06-53-23.116359.parquet
- split: latest
path:
- results_2023-09-23T06-53-23.116359.parquet
---
# Dataset Card for Evaluation run of psmathur/orca_mini_v2_13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/orca_mini_v2_13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/orca_mini_v2_13b](https://huggingface.co/psmathur/orca_mini_v2_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v2_13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-23T06:53:23.116359](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v2_13b/blob/main/results_2023-09-23T06-53-23.116359.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.06040268456375839,
"em_stderr": 0.002439712523172895,
"f1": 0.14132655201342267,
"f1_stderr": 0.0028412840520175065,
"acc": 0.39529928978029205,
"acc_stderr": 0.009624116592337587
},
"harness|drop|3": {
"em": 0.06040268456375839,
"em_stderr": 0.002439712523172895,
"f1": 0.14132655201342267,
"f1_stderr": 0.0028412840520175065
},
"harness|gsm8k|5": {
"acc": 0.06368460955269144,
"acc_stderr": 0.00672621307880572
},
"harness|winogrande|5": {
"acc": 0.7269139700078927,
"acc_stderr": 0.012522020105869456
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-98000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1069998
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
veereshd/Dreambooth_food_dataset | ---
license: unknown
---
|
abideen/ultrachat-uncensored-10k | ---
dataset_info:
features:
- name: id
dtype: string
- name: data
sequence: string
splits:
- name: train
num_bytes: 57325369
num_examples: 10000
download_size: 28939255
dataset_size: 57325369
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
0zAND1z/leap | ---
language:
- en
thumbnail: url to a thumbnail used in social sharing
tags:
- price
- analysis
license: unlicense
pretty_name: leap
size_categories:
- n<1K
---
# leap
Experimental dataset for price performance of top cryptocurrencies.
This dataset is purely research material and should not be considered an investment memorandum or financial advice.
## Acknowledgements
The token price data is sourced by calling the DefiLlama API. The API is available at https://defillama.com/docs/api. |
ZLSCompLing/validator_38h | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: 'Unnamed: 0'
dtype: int64
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 4941483067.266
num_examples: 22398
- name: test
num_bytes: 47511110.0
num_examples: 239
- name: validation
num_bytes: 244663633.339
num_examples: 1193
download_size: 4781822310
dataset_size: 5233657810.605
---
# Dataset Card for "validator_38h"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MythicalStats/videos | ---
license: openrail
---
|
SargeZT/coco-stuff-captioned-multi | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: segmented
dtype: image
- name: caption
dtype: string
- name: gray_image
dtype: image
- name: softedge
dtype: image
- name: depth
dtype: image
- name: canny
dtype: image
- name: binary
dtype: image
- name: color
dtype: image
splits:
- name: test
num_bytes: 6925042.0
num_examples: 8
- name: train
num_bytes: 7013965619.0
num_examples: 9000
download_size: 7008916049
dataset_size: 7020890661.0
---
# Dataset Card for "coco-stuff-captioned-multi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ryzzlestrizzle/multi-wiki-clustering-p2p | ---
license: cc-by-4.0
---
|
xaviviro/oasst2_es | ---
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: validation
num_bytes: 7416754
num_examples: 6598
- name: train
num_bytes: 145972634
num_examples: 128574
download_size: 54183797
dataset_size: 153389388
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
license: apache-2.0
language:
- es
--- |
daisysxm76/StereosetData | ---
license: mit
---
|
sudarsa/tts_ | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: speaker_id
dtype: int64
splits:
- name: train
num_bytes: 42048707.0
num_examples: 230
download_size: 40396922
dataset_size: 42048707.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gustavst2/dataset | ---
license: openrail
---
|
Short-Answer-Feedback/saf_legal_domain_german | ---
pretty_name: SAF - Legal Domain - German
annotations_creators:
- expert-generated
language:
- de
language_creators:
- other
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- short answer feedback
- legal domain
task_categories:
- text2text-generation
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: reference_answer
dtype: string
- name: provided_answer
dtype: string
- name: answer_feedback
dtype: string
- name: verification_feedback
dtype: string
- name: error_class
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 2142112
num_examples: 1596
- name: validation
num_bytes: 550206
num_examples: 400
- name: test_unseen_answers
num_bytes: 301087
num_examples: 221
- name: test_unseen_questions
num_bytes: 360616
num_examples: 275
download_size: 484808
dataset_size: 3354021
license: cc-by-4.0
---
# Dataset Card for "saf_legal_domain_german"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Contributions](#contributions)
## Dataset Description
### Dataset Summary
This Short Answer Feedback (SAF) dataset contains 19 German questions in the domain of the German social law (with reference answers). The idea of constructing a bilingual (English and German) short answer dataset as a way to remedy the lack of content-focused feedback datasets was introduced in [Your Answer is Incorrect... Would you like to know why? Introducing a Bilingual Short Answer Feedback Dataset](https://aclanthology.org/2022.acl-long.587) (Filighera et al., ACL 2022). Please refer to [saf_micro_job_german](https://huggingface.co/datasets/Short-Answer-Feedback/saf_micro_job_german) and [saf_communication_networks_english](https://huggingface.co/datasets/Short-Answer-Feedback/saf_communication_networks_english) for similarly constructed datasets that can be used for SAF tasks.
### Supported Tasks and Leaderboards
- `short_answer_feedback`: The dataset can be used to train a Text2Text Generation model from HuggingFace transformers in order to generate automatic short answer feedback.
### Languages
The questions, reference answers, provided answers and the answer feedback in the dataset are written in German.
## Dataset Structure
### Data Instances
An example of an entry of the training split looks as follows.
```
{
"id": "1",
"question": "Ist das eine Frage?",
"reference_answer": "Ja, das ist eine Frage.",
"provided_answer": "Ich bin mir sicher, dass das eine Frage ist.",
"answer_feedback": "Korrekt.",
"verification_feedback": "Correct",
"error_class": "Keine",
"score": 1
}
```
### Data Fields
The data fields are the same among all splits.
- `id`: a `string` feature (UUID4 in HEX format).
- `question`: a `string` feature representing a question.
- `reference_answer`: a `string` feature representing a reference answer to the question.
- `provided_answer`: a `string` feature representing an answer that was provided for a particular question.
- `answer_feedback`: a `string` feature representing the feedback given to the provided answers.
- `verification_feedback`: a `string` feature representing an automatic labeling of the score. It can be `Correct` (`score` = 1), `Incorrect` (`score` = 0) or `Partially correct` (all intermediate scores).
- `error_class`: a `string` feature representing the type of error identified in the case of a not completely correct answer.
- `score`: a `float64` feature (between 0 and 1) representing the score given to the provided answer.
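
The relationship between `verification_feedback` and `score` described above can be sketched as follows (a minimal illustration, not part of the dataset tooling; the helper name is hypothetical):

```python
# Hypothetical helper illustrating the documented score-to-label mapping:
# score = 1 -> "Correct", score = 0 -> "Incorrect", any intermediate
# score -> "Partially correct".
def verification_label(score: float) -> str:
    if score == 1:
        return "Correct"
    if score == 0:
        return "Incorrect"
    return "Partially correct"

print(verification_label(1.0))  # Correct
print(verification_label(0.5))  # Partially correct
```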
### Data Splits
The dataset is comprised of four data splits.
- `train`: used for training, contains a set of questions and the provided answers to them.
- `validation`: used for validation, contains a set of questions and the provided answers to them (derived from the original training set).
- `test_unseen_answers`: used for testing, contains unseen answers to the questions present in the `train` split.
- `test_unseen_questions`: used for testing, contains unseen questions that do not appear in the `train` split.
| Split |train|validation|test_unseen_answers|test_unseen_questions|
|-------------------|----:|---------:|------------------:|--------------------:|
|Number of instances| 1596| 400| 221| 275|
## Additional Information
### Contributions
Thanks to [@JohnnyBoy2103](https://github.com/JohnnyBoy2103) for adding this dataset. |
yezhengli9/wmt20-ta-en | ---
dataset_info:
features:
- name: id (string)
dtype: string
- name: translation (translation)
dtype: string
splits:
- name: train
num_bytes: 780653
num_examples: 997
download_size: 274301
dataset_size: 780653
---
# Dataset Card for "wmt20-ta-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_traversaal-ai__traversaal-2.5-Mistral-7B | ---
pretty_name: Evaluation run of traversaal-ai/traversaal-2.5-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [traversaal-ai/traversaal-2.5-Mistral-7B](https://huggingface.co/traversaal-ai/traversaal-2.5-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_traversaal-ai__traversaal-2.5-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T00:34:42.679909](https://huggingface.co/datasets/open-llm-leaderboard/details_traversaal-ai__traversaal-2.5-Mistral-7B/blob/main/results_2024-02-02T00-34-42.679909.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6274978626802366,\n\
\ \"acc_stderr\": 0.03223484025810943,\n \"acc_norm\": 0.6366437826897652,\n\
\ \"acc_norm_stderr\": 0.032931031908270264,\n \"mc1\": 0.3708690330477356,\n\
\ \"mc1_stderr\": 0.01690969358024882,\n \"mc2\": 0.5399700915456362,\n\
\ \"mc2_stderr\": 0.015353094182217303\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407161,\n\
\ \"acc_norm\": 0.6621160409556314,\n \"acc_norm_stderr\": 0.013822047922283509\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6597291376219877,\n\
\ \"acc_stderr\": 0.00472831857783521,\n \"acc_norm\": 0.850229038040231,\n\
\ \"acc_norm_stderr\": 0.0035611748104545588\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878937,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878937\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881564,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881564\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612896,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612896\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.030500283176545847,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.030500283176545847\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.03957835471980981,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.03957835471980981\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32513966480446926,\n\
\ \"acc_stderr\": 0.01566654278505355,\n \"acc_norm\": 0.32513966480446926,\n\
\ \"acc_norm_stderr\": 0.01566654278505355\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.01274197433389723,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.01274197433389723\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898452,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898452\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495155,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495155\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786862,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786862\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3708690330477356,\n\
\ \"mc1_stderr\": 0.01690969358024882,\n \"mc2\": 0.5399700915456362,\n\
\ \"mc2_stderr\": 0.015353094182217303\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643412\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1652767247915087,\n \
\ \"acc_stderr\": 0.010231031118582121\n }\n}\n```"
repo_url: https://huggingface.co/traversaal-ai/traversaal-2.5-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|arc:challenge|25_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|gsm8k|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hellaswag|10_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-34-42.679909.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T00-34-42.679909.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- '**/details_harness|winogrande|5_2024-02-02T00-34-42.679909.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T00-34-42.679909.parquet'
- config_name: results
data_files:
- split: 2024_02_02T00_34_42.679909
path:
- results_2024-02-02T00-34-42.679909.parquet
- split: latest
path:
- results_2024-02-02T00-34-42.679909.parquet
---
# Dataset Card for Evaluation run of traversaal-ai/traversaal-2.5-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [traversaal-ai/traversaal-2.5-Mistral-7B](https://huggingface.co/traversaal-ai/traversaal-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_traversaal-ai__traversaal-2.5-Mistral-7B",
"harness_winogrande_5",
        split="latest")
```
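The configuration names listed in this repository follow a fixed scheme: the harness task name (as it appears in the result keys, e.g. `harness|hendrycksTest-world_religions|5`) with its separators replaced by underscores. A small sketch of that mapping (the helper below is illustrative, not part of any library):

```python
import re

def config_name(task: str, n_shot: int) -> str:
    """Build a config name like 'harness_hendrycksTest_world_religions_5'
    from a harness task name and its few-shot count."""
    # Replace the separators used in result keys ('-', ':', '|') with '_'.
    return "harness_" + re.sub(r"[-:|]", "_", task) + f"_{n_shot}"

print(config_name("hendrycksTest-world_religions", 5))
# harness_hendrycksTest_world_religions_5
print(config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

Any of these names can be passed as the second argument to `load_dataset` above.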
## Latest results
These are the [latest results from run 2024-02-02T00:34:42.679909](https://huggingface.co/datasets/open-llm-leaderboard/details_traversaal-ai__traversaal-2.5-Mistral-7B/blob/main/results_2024-02-02T00-34-42.679909.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6274978626802366,
"acc_stderr": 0.03223484025810943,
"acc_norm": 0.6366437826897652,
"acc_norm_stderr": 0.032931031908270264,
"mc1": 0.3708690330477356,
"mc1_stderr": 0.01690969358024882,
"mc2": 0.5399700915456362,
"mc2_stderr": 0.015353094182217303
},
"harness|arc:challenge|25": {
"acc": 0.6237201365187713,
"acc_stderr": 0.014157022555407161,
"acc_norm": 0.6621160409556314,
"acc_norm_stderr": 0.013822047922283509
},
"harness|hellaswag|10": {
"acc": 0.6597291376219877,
"acc_stderr": 0.00472831857783521,
"acc_norm": 0.850229038040231,
"acc_norm_stderr": 0.0035611748104545588
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878937,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878937
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881564,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881564
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612896,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612896
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545847,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545847
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980981,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980981
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32513966480446926,
"acc_stderr": 0.01566654278505355,
"acc_norm": 0.32513966480446926,
"acc_norm_stderr": 0.01566654278505355
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.01274197433389723,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.01274197433389723
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898452,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898452
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495155,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495155
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786862,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786862
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3708690330477356,
"mc1_stderr": 0.01690969358024882,
"mc2": 0.5399700915456362,
"mc2_stderr": 0.015353094182217303
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643412
},
"harness|gsm8k|5": {
"acc": 0.1652767247915087,
"acc_stderr": 0.010231031118582121
}
}
```
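The per-task entries above can be re-aggregated locally. As a minimal sketch (using a three-task excerpt of the dict above; the leaderboard itself computes the official aggregate), this macro-averages `acc_norm` over the MMLU (`hendrycksTest`) subtasks:

```python
# Excerpt of the results dict above (the full dict has 57 MMLU subtasks).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.5851851851851851},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.7105263157894737},
}

# Select the MMLU subtasks by their key prefix and macro-average acc_norm.
mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest-")]
mmlu_acc = sum(results[k]["acc_norm"] for k in mmlu_tasks) / len(mmlu_tasks)
print(f"MMLU macro-average acc_norm over {len(mmlu_tasks)} tasks: {mmlu_acc:.4f}")
```

The same pattern works for `acc`, `mc2`, or any other metric key present in the per-task dicts.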
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Steelskull__Aethora-7b-v1 | ---
pretty_name: Evaluation run of Steelskull/Aethora-7b-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Steelskull/Aethora-7b-v1](https://huggingface.co/Steelskull/Aethora-7b-v1) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Steelskull__Aethora-7b-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T18:41:00.387218](https://huggingface.co/datasets/open-llm-leaderboard/details_Steelskull__Aethora-7b-v1/blob/main/results_2024-04-02T18-41-00.387218.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6158065192438972,\n\
\ \"acc_stderr\": 0.03286108070031622,\n \"acc_norm\": 0.6220087964557879,\n\
\ \"acc_norm_stderr\": 0.03353857534068535,\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5486805434095359,\n\
\ \"mc2_stderr\": 0.015016359359661927\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5477815699658704,\n \"acc_stderr\": 0.014544519880633823,\n\
\ \"acc_norm\": 0.5947098976109215,\n \"acc_norm_stderr\": 0.014346869060229316\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5770762796255726,\n\
\ \"acc_stderr\": 0.004930138842768225,\n \"acc_norm\": 0.793168691495718,\n\
\ \"acc_norm_stderr\": 0.004042057039394373\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493875,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.041307408795554966,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.041307408795554966\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239976,\n \"\
acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239976\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5221674876847291,\n \"acc_stderr\": 0.035145285621750066,\n \"\
acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.035145285621750066\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\"\
: 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723886,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723886\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.02483881198803316,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.02483881198803316\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n\
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.031204691225150023,\n\
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.031204691225150023\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.0165952597103993,\n \"acc_norm\"\
: 0.8165137614678899,\n \"acc_norm_stderr\": 0.0165952597103993\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n\
\ \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794087,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794087\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.0225090339370778,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.0225090339370778\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7994891443167306,\n\
\ \"acc_stderr\": 0.0143176537085942,\n \"acc_norm\": 0.7994891443167306,\n\
\ \"acc_norm_stderr\": 0.0143176537085942\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34413407821229053,\n\
\ \"acc_stderr\": 0.015889221313307094,\n \"acc_norm\": 0.34413407821229053,\n\
\ \"acc_norm_stderr\": 0.015889221313307094\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7395498392282959,\n\
\ \"acc_stderr\": 0.024926723224845543,\n \"acc_norm\": 0.7395498392282959,\n\
\ \"acc_norm_stderr\": 0.024926723224845543\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.02555765398186806,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.02555765398186806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236848,\n \
\ \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236848\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n\
\ \"acc_stderr\": 0.012731102790504519,\n \"acc_norm\": 0.46088657105606257,\n\
\ \"acc_norm_stderr\": 0.012731102790504519\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.030042615832714867,\n\
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.030042615832714867\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.630718954248366,\n \"acc_stderr\": 0.01952431674486635,\n \
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.01952431674486635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683906,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683906\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338733,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338733\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5486805434095359,\n\
\ \"mc2_stderr\": 0.015016359359661927\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.01157061486140935\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.34495830174374525,\n \
\ \"acc_stderr\": 0.013093630133666222\n }\n}\n```"
repo_url: https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|arc:challenge|25_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|gsm8k|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hellaswag|10_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-41-00.387218.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T18-41-00.387218.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- '**/details_harness|winogrande|5_2024-04-02T18-41-00.387218.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T18-41-00.387218.parquet'
- config_name: results
data_files:
- split: 2024_04_02T18_41_00.387218
path:
- results_2024-04-02T18-41-00.387218.parquet
- split: latest
path:
- results_2024-04-02T18-41-00.387218.parquet
---
# Dataset Card for Evaluation run of Steelskull/Aethora-7b-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Steelskull/Aethora-7b-v1](https://huggingface.co/Steelskull/Aethora-7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Steelskull__Aethora-7b-v1",
"harness_winogrande_5",
split="train")
```
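The split names above follow a fixed convention derived from the run timestamp: dashes and colons become underscores. A small helper (purely illustrative, not part of any official tooling) can derive the split name for a given run:

```python
def timestamp_to_split(run_timestamp: str) -> str:
    """Map a run timestamp such as '2024-04-02T18:41:00.387218' to the
    corresponding split name ('2024_04_02T18_41_00.387218')."""
    return run_timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-04-02T18:41:00.387218"))
# 2024_04_02T18_41_00.387218
```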
## Latest results
These are the [latest results from run 2024-04-02T18:41:00.387218](https://huggingface.co/datasets/open-llm-leaderboard/details_Steelskull__Aethora-7b-v1/blob/main/results_2024-04-02T18-41-00.387218.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.6158065192438972,
"acc_stderr": 0.03286108070031622,
"acc_norm": 0.6220087964557879,
"acc_norm_stderr": 0.03353857534068535,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5486805434095359,
"mc2_stderr": 0.015016359359661927
},
"harness|arc:challenge|25": {
"acc": 0.5477815699658704,
"acc_stderr": 0.014544519880633823,
"acc_norm": 0.5947098976109215,
"acc_norm_stderr": 0.014346869060229316
},
"harness|hellaswag|10": {
"acc": 0.5770762796255726,
"acc_stderr": 0.004930138842768225,
"acc_norm": 0.793168691495718,
"acc_norm_stderr": 0.004042057039394373
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493875,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239976,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239976
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.035145285621750066,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.035145285621750066
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723886,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723886
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.02483881198803316,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02483881198803316
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.031204691225150023,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.031204691225150023
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.0165952597103993,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.0165952597103993
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794087,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.0225090339370778,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.0225090339370778
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7994891443167306,
"acc_stderr": 0.0143176537085942,
"acc_norm": 0.7994891443167306,
"acc_norm_stderr": 0.0143176537085942
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34413407821229053,
"acc_stderr": 0.015889221313307094,
"acc_norm": 0.34413407821229053,
"acc_norm_stderr": 0.015889221313307094
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7395498392282959,
"acc_stderr": 0.024926723224845543,
"acc_norm": 0.7395498392282959,
"acc_norm_stderr": 0.024926723224845543
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.02555765398186806,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.02555765398186806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236848,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504519,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504519
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.030042615832714867,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.030042615832714867
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.01952431674486635,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.01952431674486635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5486805434095359,
"mc2_stderr": 0.015016359359661927
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.01157061486140935
},
"harness|gsm8k|5": {
"acc": 0.34495830174374525,
"acc_stderr": 0.013093630133666222
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
danthareja/cognitive-distortion | ---
dataset_info:
features:
- name: id
dtype: int64
- name: ' patient_question'
dtype: string
- name: distorted_part
dtype: string
- name: dominant_distortion
dtype:
class_label:
names:
'0': All-or-nothing thinking
'1': Emotional Reasoning
'2': No Distortion
'3': Magnification
'4': Fortune-telling
'5': Personalization
'6': Should statements
'7': Overgeneralization
'8': Mind Reading
'9': Mental filter
'10': Labeling
splits:
- name: train
num_bytes: 2076672.8
num_examples: 2024
- name: test
num_bytes: 519168.2
num_examples: 506
download_size: 1579073
dataset_size: 2595841.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
williamberman/wikiart | ---
license: unknown
license_details: Data files © Original Authors
size_categories:
- 10K<n<100K
task_categories:
- image-classification
- text-to-image
- image-to-text
tags:
- art
---
## Dataset Description
- **Homepage:** https://www.wikiart.org/
### Dataset Summary
Dataset containing 81,444 pieces of visual art from various artists, taken from WikiArt.org,
along with class labels for each image:
* "artist" : 129 artist classes, including a "Unknown Artist" class
* "genre" : 11 genre classes, including a "Unknown Genre" class
* "style" : 27 style classes
On WikiArt.org, the description for the "Artworks by Genre" page reads:
A genre system divides artworks according to depicted themes and objects. A classical hierarchy of genres was developed in European culture by the 17th century. It ranked genres in high – history painting and portrait – and low – genre painting, landscape and still life. This hierarchy was based on the notion of man as the measure of all things. Landscape and still life were the lowest because they did not involve human subject matter. History was highest because it dealt with the noblest events of humanity. The genre system is not so much relevant for contemporary art; there are just two genre definitions that are usually applied to it: abstract or figurative.
The "Artworks by Style" page reads:
A style of an artwork refers to its distinctive visual elements, techniques and methods. It usually corresponds with an art movement or a school (group) that its author is associated with.
## Dataset Structure
* "image" : image
* "artist" : 129 artist classes, including a "Unknown Artist" class
* "genre" : 11 genre classes, including a "Unknown Genre" class
* "style" : 27 style classes
### Source Data
Files taken from this [archive](https://archive.org/download/wikiart-dataset/wikiart.tar.gz), curated from the [WikiArt website](https://www.wikiart.org/).
## Additional Information
Note:
* The WikiArt dataset can be used only for non-commercial research purposes.
* The images in the WikiArt dataset were obtained from WikiArt.org.
* The authors are neither responsible for the content nor the meaning of these images.
By using the WikiArt dataset, you agree to obey the terms and conditions of WikiArt.org.
### Contributions
[`gigant`](https://huggingface.co/gigant) added this dataset to the hub. |
nvidia/sft_datablend_v1 | ---
license: cc-by-4.0
task_categories:
- text-generation
---
# Dataset Card
This dataset is a blend of publicly available datasets for instruction tuning, including samples from OASST, CodeContests, FLAN, T0, Open_Platypus, and GSM8K.
Note that for datasets consisting of multiple subsets, we only include subsets with a permissive license for commercial use.
As a data blend, some subsets may have been sampled for more than one epoch depending on sampling ratios and dataset sizes.
## Dataset
The dataset consists of four columns:
1. conversations: user and assistant turns in a conversational format
2. mask: the turns that losses are not calculated on ("User" by default)
3. system: system prompt (empty by default)
4. dataset: dataset source
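The four columns above can be pictured with a minimal record (a sketch; the exact turn format and role labels are assumptions for illustration, not taken from the dataset):

```python
# Hypothetical record shape illustrating the four columns described above.
record = {
    "conversations": [
        {"from": "User", "value": "What is 2 + 2?"},
        {"from": "Assistant", "value": "2 + 2 = 4."},
    ],
    "mask": "User",     # role whose turns are excluded from the loss
    "system": "",       # system prompt, empty by default
    "dataset": "GSM8K", # source dataset for this sample
}

def loss_bearing_turns(rec: dict) -> list[str]:
    """Return the turn texts that losses ARE calculated on,
    i.e. every turn whose role is not the masked one."""
    return [t["value"] for t in rec["conversations"] if t["from"] != rec["mask"]]

print(loss_bearing_turns(record))  # ['2 + 2 = 4.']
```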
## License
The detailed license information for all the data sources utilized in the blend is listed below.
It is usable for commercial purposes as long as you follow the terms of the licenses.
| Dataset Name | License Type
| -------- | -------- |
| [OASST](https://huggingface.co/datasets/OpenAssistant/oasst1) | Apache-2.0 |
| [CodeContests](https://github.com/google-deepmind/code_contests) | CC-BY-4.0 |
| [MNLI](https://huggingface.co/datasets/multi_nli) | OANC / Creative Commons Share-Alike 3.0 Unported / Creative Commons Attribution 3.0 Unported |
| [QNLI](https://gluebenchmark.com/tasks) | CC-BY-SA-4.0 |
| [WNLI](https://cs.nyu.edu/~davise/papers/WinogradSchemas/WS.html) | Creative Commons Attribution 4.0 International License |
| [BooLQ](https://huggingface.co/datasets/google/boolq) | CC-BY-SA-3.0 |
| [DROP](https://paperswithcode.com/dataset/drop) | CC-BY-SA-4.0 |
| [OpenbookQA](https://github.com/allenai/OpenBookQA) | Apache-2.0 |
| [SQuAD v1](https://paperswithcode.com/dataset/squad) | CC-BY-SA-4.0 |
| [SQuAD v2](https://paperswithcode.com/dataset/squad) | CC-BY-SA-4.0 |
| [COPA](https://people.ict.usc.edu/~gordon/copa.html) | BSD 2-Clause License |
| [HellaSwag](https://github.com/rowanz/hellaswag/blob/master) | MIT |
| [PIQA](https://yonatanbisk.com/piqa/) |Academic Free License (“AFL”) v. 3.0 |
| [StoryCloze](https://cs.rochester.edu/nlp/rocstories/) | [Custom](https://docs.google.com/forms/d/e/1FAIpQLSe83zPs21IGH9-HC1SuUa2hfyopJOHgTHft--Ne4SOj0VoViA/viewform?c=0&w=1) |
| [ARC](https://huggingface.co/datasets/ai2_arc) | CC-BY-SA-4.0 |
| [NQ](https://huggingface.co/datasets/nq_open) | CC-BY-SA-3.0 |
| [TriviaQA](https://github.com/mandarjoshi90/triviaqa) | Apache-2.0 |
| [Paws Wiki](https://github.com/google-research-datasets/paws) | [Custom](https://github.com/google-research-datasets/paws/blob/master/LICENSE) |
| [Winogrande](https://winogrande.allenai.org/) | CC-BY |
| [WSC273](https://cs.nyu.edu/~davise/papers/WinogradSchemas/WS.html) | Creative Commons Attribution 4.0 International License |
| [CosmosQA](https://wilburone.github.io/cosmos/) | CC-BY-4.0 |
| [ReCoRD CNN/Daily Mail](https://sheng-z.github.io/ReCoRD-explorer/) | Apache-2.0 |
| [DART](https://github.com/Yale-LILY/dart) | MIT |
| [E2ENLG](https://github.com/tuetschek/e2e-dataset) | CC-BY-SA-4.0 |
| [QuAC](https://quac.ai/) | CC-BY-SA-4.0 |
| [Mathematics](https://github.com/deepmind/mathematics_dataset) | Apache-2.0 |
| [SNLI](https://nlp.stanford.edu/projects/snli/) | CC-BY-SA-4.0 |
| [Adversarial QA](https://huggingface.co/datasets/adversarial_qa) | CC-BY-SA-4.0 |
| [Amazon Polarity](https://huggingface.co/datasets/amazon_polarity) | Apache-2.0 |
| [DBPedia](https://huggingface.co/datasets/dbpedia_14) | CC-BY-SA-3.0 |
| [DuoRC](https://huggingface.co/datasets/duorc) | MIT |
| [Hotpot QA](https://huggingface.co/datasets/kilt_tasks/viewer/hotpotqa) | MIT |
| [QASC](https://huggingface.co/datasets/qasc) | CC-BY-4.0 |
| [Quarel](https://allenai.org/data/quarell) | CC-BY |
| [QuaRTz](https://allenai.org/data/quartz) | CC-BY |
| [Quoref](https://huggingface.co/datasets/quoref) | CC-BY-4.0 |
| [ROPES](https://huggingface.co/datasets/ropes) | CC-BY-4.0 |
| [Social IQA](https://allenai.org/data/socialiqa) | CC-BY |
| [Wiki Bio](https://huggingface.co/datasets/wiki_bio) | CC-BY-SA-3.0 |
| [Wiki Hop](https://huggingface.co/datasets/wiki_hop) | CC-BY-SA-3.0 |
| [ARB](https://github.com/TheDuckAI/arb) | CC-BY-4.0 |
| [tigerbot-kaggle-leetcodesolutions-en-2k](https://huggingface.co/datasets/TigerResearch/tigerbot-kaggle-leetcodesolutions-en-2k) | Apache-2.0 |
| [SciBench](https://github.com/mandyyyyii/scibench) | MIT |
| [PRM800K](https://github.com/openai/prm800k) | MIT |
| [GSM8K](https://github.com/openai/grade-school-math) | MIT | |