datasetId | card |
|---|---|
mlabonne/distilabel-intel-orca-dpo-pairs | ---
dataset_info:
features:
- name: system
dtype: string
- name: question
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: generations
sequence: string
- name: order
sequence: string
- name: labelling_model
dtype: string
- name: labelling_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_labelling_response
dtype: string
- name: rating
sequence: float64
- name: rationale
dtype: string
- name: status
dtype: string
- name: original_chosen
dtype: string
- name: original_rejected
dtype: string
- name: chosen_score
dtype: float64
- name: in_gsm8k_train
dtype: bool
- name: abs_difference
dtype: float64
splits:
- name: train
num_bytes: 75137131.0
num_examples: 5922
download_size: 36744794
dataset_size: 75137131.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- synthetic
- distilabel
---
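A minimal loading sketch, assuming standard `datasets` usage for the train split and the feature names listed above:
```python
from datasets import load_dataset

# Minimal sketch: load the train split and inspect the preference columns
# described in the feature list above.
ds = load_dataset("mlabonne/distilabel-intel-orca-dpo-pairs", split="train")
print(ds.column_names)          # expect 'system', 'question', 'chosen', 'rejected', ...
print(ds[0]["question"][:200])  # preview one prompt
```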
|
open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo-v0.2 | ---
pretty_name: Evaluation run of Technoculture/MT7Bi-alpha-dpo-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Technoculture/MT7Bi-alpha-dpo-v0.2](https://huggingface.co/Technoculture/MT7Bi-alpha-dpo-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T12:50:13.790724](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo-v0.2/blob/main/results_2024-02-09T12-50-13.790724.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5274798937736077,\n\
\ \"acc_stderr\": 0.034244329313021585,\n \"acc_norm\": 0.5324573781856667,\n\
\ \"acc_norm_stderr\": 0.034973478659411146,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.45484028768936574,\n\
\ \"mc2_stderr\": 0.015178684073869702\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5204778156996587,\n \"acc_stderr\": 0.01459913135303501,\n\
\ \"acc_norm\": 0.5469283276450512,\n \"acc_norm_stderr\": 0.014546892052005628\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5714997012547302,\n\
\ \"acc_stderr\": 0.0049385003039902845,\n \"acc_norm\": 0.7589125672176857,\n\
\ \"acc_norm_stderr\": 0.004268690572638815\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365242,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365242\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319616,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319616\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.032555253593403555,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.032555253593403555\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n\
\ \"acc_stderr\": 0.028129112709165904,\n \"acc_norm\": 0.5741935483870968,\n\
\ \"acc_norm_stderr\": 0.028129112709165904\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244441,\n \"\
acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244441\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.032396370467357036,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.032396370467357036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4897435897435897,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.4897435897435897,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945284,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945284\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5168067226890757,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.5168067226890757,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.037804458505267334,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.037804458505267334\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7247706422018348,\n \"acc_stderr\": 0.019149093743155203,\n \"\
acc_norm\": 0.7247706422018348,\n \"acc_norm_stderr\": 0.019149093743155203\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608043,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236436,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236436\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\
\ \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n\
\ \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.04345724570292534,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.04345724570292534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899615,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899615\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n\
\ \"acc_stderr\": 0.027421007295392926,\n \"acc_norm\": 0.7735042735042735,\n\
\ \"acc_norm_stderr\": 0.027421007295392926\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7241379310344828,\n\
\ \"acc_stderr\": 0.01598281477469563,\n \"acc_norm\": 0.7241379310344828,\n\
\ \"acc_norm_stderr\": 0.01598281477469563\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.026362437574546545,\n\
\ \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.026362437574546545\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2558659217877095,\n\
\ \"acc_stderr\": 0.014593620923210723,\n \"acc_norm\": 0.2558659217877095,\n\
\ \"acc_norm_stderr\": 0.014593620923210723\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5627009646302251,\n\
\ \"acc_stderr\": 0.0281739177617629,\n \"acc_norm\": 0.5627009646302251,\n\
\ \"acc_norm_stderr\": 0.0281739177617629\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.02758600622160771,\n\
\ \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.02758600622160771\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3546099290780142,\n \"acc_stderr\": 0.02853865002887864,\n \
\ \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.02853865002887864\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38005215123859193,\n\
\ \"acc_stderr\": 0.012397328205137809,\n \"acc_norm\": 0.38005215123859193,\n\
\ \"acc_norm_stderr\": 0.012397328205137809\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.029624663581159703,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.029624663581159703\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5261437908496732,\n \"acc_stderr\": 0.020200164564804588,\n \
\ \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.020200164564804588\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.046737523336702384,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.046737523336702384\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.03093285879278986,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.03093285879278986\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6019900497512438,\n\
\ \"acc_stderr\": 0.03461199429040013,\n \"acc_norm\": 0.6019900497512438,\n\
\ \"acc_norm_stderr\": 0.03461199429040013\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6491228070175439,\n \"acc_stderr\": 0.03660298834049163,\n\
\ \"acc_norm\": 0.6491228070175439,\n \"acc_norm_stderr\": 0.03660298834049163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.45484028768936574,\n\
\ \"mc2_stderr\": 0.015178684073869702\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7158642462509865,\n \"acc_stderr\": 0.01267539278677272\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.25928733889310085,\n \
\ \"acc_stderr\": 0.012071405369905506\n }\n}\n```"
repo_url: https://huggingface.co/Technoculture/MT7Bi-alpha-dpo-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|arc:challenge|25_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|gsm8k|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hellaswag|10_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-50-13.790724.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T12-50-13.790724.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- '**/details_harness|winogrande|5_2024-02-09T12-50-13.790724.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T12-50-13.790724.parquet'
- config_name: results
data_files:
- split: 2024_02_09T12_50_13.790724
path:
- results_2024-02-09T12-50-13.790724.parquet
- split: latest
path:
- results_2024-02-09T12-50-13.790724.parquet
---
# Dataset Card for Evaluation run of Technoculture/MT7Bi-alpha-dpo-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/MT7Bi-alpha-dpo-v0.2](https://huggingface.co/Technoculture/MT7Bi-alpha-dpo-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo-v0.2",
"harness_winogrande_5",
split="train")
```
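The aggregated scores can be read in the same way from the `results` configuration; a minimal sketch, assuming the `latest` split behaves as described above:
```python
from datasets import load_dataset

# Minimal sketch: the "results" configuration holds the aggregated metrics of
# the run; "latest" should point at the most recent evaluation (see above).
results = load_dataset(
    "open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo-v0.2",
    "results",
    split="latest",
)
print(results[0])
```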
## Latest results
These are the [latest results from run 2024-02-09T12:50:13.790724](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo-v0.2/blob/main/results_2024-02-09T12-50-13.790724.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5274798937736077,
"acc_stderr": 0.034244329313021585,
"acc_norm": 0.5324573781856667,
"acc_norm_stderr": 0.034973478659411146,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.45484028768936574,
"mc2_stderr": 0.015178684073869702
},
"harness|arc:challenge|25": {
"acc": 0.5204778156996587,
"acc_stderr": 0.01459913135303501,
"acc_norm": 0.5469283276450512,
"acc_norm_stderr": 0.014546892052005628
},
"harness|hellaswag|10": {
"acc": 0.5714997012547302,
"acc_stderr": 0.0049385003039902845,
"acc_norm": 0.7589125672176857,
"acc_norm_stderr": 0.004268690572638815
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365242,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365242
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319616,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319616
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165904,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165904
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244441,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244441
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.032396370467357036,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.032396370467357036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4897435897435897,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.4897435897435897,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945284,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5168067226890757,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.5168067226890757,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.037804458505267334,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.037804458505267334
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7247706422018348,
"acc_stderr": 0.019149093743155203,
"acc_norm": 0.7247706422018348,
"acc_norm_stderr": 0.019149093743155203
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236436,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236436
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.04345724570292534,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.04345724570292534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899615,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899615
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7735042735042735,
"acc_stderr": 0.027421007295392926,
"acc_norm": 0.7735042735042735,
"acc_norm_stderr": 0.027421007295392926
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.01598281477469563,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.01598281477469563
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.026362437574546545,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.026362437574546545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2558659217877095,
"acc_stderr": 0.014593620923210723,
"acc_norm": 0.2558659217877095,
"acc_norm_stderr": 0.014593620923210723
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5627009646302251,
"acc_stderr": 0.0281739177617629,
"acc_norm": 0.5627009646302251,
"acc_norm_stderr": 0.0281739177617629
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.02758600622160771,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.02758600622160771
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.02853865002887864,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.02853865002887864
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38005215123859193,
"acc_stderr": 0.012397328205137809,
"acc_norm": 0.38005215123859193,
"acc_norm_stderr": 0.012397328205137809
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.029624663581159703,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.029624663581159703
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5261437908496732,
"acc_stderr": 0.020200164564804588,
"acc_norm": 0.5261437908496732,
"acc_norm_stderr": 0.020200164564804588
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.046737523336702384,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.046737523336702384
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.03093285879278986,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.03093285879278986
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6019900497512438,
"acc_stderr": 0.03461199429040013,
"acc_norm": 0.6019900497512438,
"acc_norm_stderr": 0.03461199429040013
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.03660298834049163,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.03660298834049163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.45484028768936574,
"mc2_stderr": 0.015178684073869702
},
"harness|winogrande|5": {
"acc": 0.7158642462509865,
"acc_stderr": 0.01267539278677272
},
"harness|gsm8k|5": {
"acc": 0.25928733889310085,
"acc_stderr": 0.012071405369905506
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AppleHarem/shirayuki_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shirayuki (Arknights)
This is the dataset of shirayuki (Arknights), containing 58 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
The crawlers and related tools are available through a WebUI: [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 58 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 151 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 167 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 58 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 58 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 58 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 151 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 151 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 132 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 167 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 167 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
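As a rough, unofficial sketch (not part of the original card), the packaged archives listed above can be fetched with `huggingface_hub`, treating this repository as a dataset repo:
```python
from zipfile import ZipFile

from huggingface_hub import hf_hub_download

# Download one of the archives from the table above (here: the raw data with meta information).
zip_path = hf_hub_download(
    repo_id="AppleHarem/shirayuki_arknights",
    filename="dataset-raw.zip",
    repo_type="dataset",
)

# Unpack the images and their tag/meta files into a local folder.
with ZipFile(zip_path) as zf:
    zf.extractall("shirayuki_raw")
```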
|
irds/neumarco_fa_train_judged | ---
pretty_name: '`neumarco/fa/train/judged`'
viewer: false
source_datasets: ['irds/neumarco_fa', 'irds/neumarco_fa_train']
task_categories:
- text-retrieval
---
# Dataset Card for `neumarco/fa/train/judged`
The `neumarco/fa/train/judged` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/neumarco#neumarco/fa/train/judged).
# Data
This dataset provides:
- `queries` (i.e., topics); count=502,939
- For `docs`, use [`irds/neumarco_fa`](https://huggingface.co/datasets/irds/neumarco_fa)
- For `qrels`, use [`irds/neumarco_fa_train`](https://huggingface.co/datasets/irds/neumarco_fa_train)
- For `docpairs`, use [`irds/neumarco_fa_train`](https://huggingface.co/datasets/irds/neumarco_fa_train)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/neumarco_fa_train_judged', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
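As a hedged sketch (assuming the companion repositories expose `docs` and `qrels` configurations in the same style as the `queries` configuration shown above), the referenced data can be loaded analogously:
```python
from datasets import load_dataset

# Documents are provided by the shared corpus repository.
docs = load_dataset('irds/neumarco_fa', 'docs')

# Relevance judgments (qrels) are provided by the train repository.
qrels = load_dataset('irds/neumarco_fa_train', 'qrels')
```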
|
CatUkraine/minetest-screenshots1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 122904
num_examples: 81
download_size: 88945
dataset_size: 122904
tags:
- minetest
- image generation
---
# Dataset Card for "minetest-screenshots1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
image size: 64x64 |
nouman-10/test | ---
dataset_info:
features:
- name: image
dtype: image
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 2359545.254237288
num_examples: 170
download_size: 2345262
dataset_size: 2359545.254237288
---
# Dataset Card for "test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
galbitang/autotrain-data-jeongmi_chair | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: jeongmi_chair
## Dataset Description
This dataset has been automatically processed by AutoTrain for project jeongmi_chair.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<1000x1000 RGB PIL image>",
"target": 4
},
{
"image": "<700x700 RGB PIL image>",
"target": 6
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['classsicantique', 'frenchprovence', 'industrial', 'koreaaisa', 'lovelyromantic', 'minimalsimple', 'modern', 'natural', 'notherneurope', 'unique', 'vintatageretro'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 796 |
| valid | 204 |
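For quick inspection, a minimal sketch such as the following can load the splits and read one sample; it assumes the repository is public and that the split names reported above are the ones actually exposed (the validation split may be named `validation` rather than `valid`):
```python
from datasets import load_dataset

# Load every available split of the AutoTrain project data.
ds = load_dataset("galbitang/autotrain-data-jeongmi_chair")
print(ds)  # shows the actual split names and sizes

# Inspect a single training example: a PIL image and an integer class label.
sample = ds["train"][0]
print(sample["image"].size, sample["target"])
```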
|
J-Mourad/TokenizedMNAD.v2 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 1086299676
num_examples: 1351119
- name: validation
num_bytes: 119121444
num_examples: 148161
download_size: 725794557
dataset_size: 1205421120
---
# Dataset Card for "TokenizedMNAD.v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RayBernard/cumuluslinux | ---
license: mit
---
|
Pablao0948/Wynq | ---
license: openrail
---
|
open-llm-leaderboard/details_jondurbin__bagel-dpo-34b-v0.2 | ---
pretty_name: Evaluation run of jondurbin/bagel-dpo-34b-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/bagel-dpo-34b-v0.2](https://huggingface.co/jondurbin/bagel-dpo-34b-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__bagel-dpo-34b-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T04:16:58.738953](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__bagel-dpo-34b-v0.2/blob/main/results_2024-01-05T04-16-58.738953.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7613608627936146,\n\
\ \"acc_stderr\": 0.028274274385660204,\n \"acc_norm\": 0.7665014924179901,\n\
\ \"acc_norm_stderr\": 0.028800772478207726,\n \"mc1\": 0.5336597307221542,\n\
\ \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.7005121569261619,\n\
\ \"mc2_stderr\": 0.014305944779045657\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6902730375426621,\n \"acc_stderr\": 0.013512058415238363,\n\
\ \"acc_norm\": 0.7192832764505119,\n \"acc_norm_stderr\": 0.013131238126975578\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6579366660027883,\n\
\ \"acc_stderr\": 0.004734311435009195,\n \"acc_norm\": 0.8525194184425413,\n\
\ \"acc_norm_stderr\": 0.0035385967737048152\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.875,\n \"acc_stderr\": 0.026913523521537846,\n \
\ \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.026913523521537846\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n\
\ \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372274,\n\
\ \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372274\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9027777777777778,\n\
\ \"acc_stderr\": 0.024774516250440182,\n \"acc_norm\": 0.9027777777777778,\n\
\ \"acc_norm_stderr\": 0.024774516250440182\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\":\
\ 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n\
\ \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.7514450867052023,\n\
\ \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387536,\n\
\ \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387536\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n\
\ \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n\
\ \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7310344827586207,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.7310344827586207,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7142857142857143,\n \"acc_stderr\": 0.02326651221373057,\n \"\
acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.02326651221373057\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6190476190476191,\n\
\ \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.6190476190476191,\n\
\ \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n\
\ \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n\
\ \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n\
\ \"acc_norm\": 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706456,\n\
\ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706456\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9242424242424242,\n \"acc_stderr\": 0.018852670234993093,\n \"\
acc_norm\": 0.9242424242424242,\n \"acc_norm_stderr\": 0.018852670234993093\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.011464523356953162,\n\
\ \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.011464523356953162\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8179487179487179,\n \"acc_stderr\": 0.0195652367829309,\n \
\ \"acc_norm\": 0.8179487179487179,\n \"acc_norm_stderr\": 0.0195652367829309\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4666666666666667,\n \"acc_stderr\": 0.030417716961717477,\n \
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.030417716961717477\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.024762902678057933,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.024762902678057933\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"\
acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9174311926605505,\n \"acc_stderr\": 0.01180036136301657,\n \"\
acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.01180036136301657\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6759259259259259,\n \"acc_stderr\": 0.03191923445686185,\n \"\
acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.03191923445686185\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \
\ \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.027157150479563824,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.027157150479563824\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540637,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540637\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n\
\ \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n\
\ \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553848,\n\
\ \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553848\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n\
\ \"acc_stderr\": 0.015006312806446912,\n \"acc_norm\": 0.9444444444444444,\n\
\ \"acc_norm_stderr\": 0.015006312806446912\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9054916985951469,\n\
\ \"acc_stderr\": 0.010461015338193071,\n \"acc_norm\": 0.9054916985951469,\n\
\ \"acc_norm_stderr\": 0.010461015338193071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8179190751445087,\n \"acc_stderr\": 0.020776761102512975,\n\
\ \"acc_norm\": 0.8179190751445087,\n \"acc_norm_stderr\": 0.020776761102512975\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8067039106145252,\n\
\ \"acc_stderr\": 0.013206868561343229,\n \"acc_norm\": 0.8067039106145252,\n\
\ \"acc_norm_stderr\": 0.013206868561343229\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8464052287581699,\n \"acc_stderr\": 0.020645597910418777,\n\
\ \"acc_norm\": 0.8464052287581699,\n \"acc_norm_stderr\": 0.020645597910418777\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8038585209003215,\n\
\ \"acc_stderr\": 0.022552447780478033,\n \"acc_norm\": 0.8038585209003215,\n\
\ \"acc_norm_stderr\": 0.022552447780478033\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.018877353839571842,\n\
\ \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.018877353839571842\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6276595744680851,\n \"acc_stderr\": 0.02883892147125145,\n \
\ \"acc_norm\": 0.6276595744680851,\n \"acc_norm_stderr\": 0.02883892147125145\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5788787483702738,\n\
\ \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.5788787483702738,\n\
\ \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8308823529411765,\n \"acc_stderr\": 0.022770868010113014,\n\
\ \"acc_norm\": 0.8308823529411765,\n \"acc_norm_stderr\": 0.022770868010113014\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.815359477124183,\n \"acc_stderr\": 0.01569702924075778,\n \
\ \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.01569702924075778\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.02366169917709861,\n\
\ \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.02366169917709861\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.02353755765789255,\n\
\ \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.02353755765789255\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5336597307221542,\n\
\ \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.7005121569261619,\n\
\ \"mc2_stderr\": 0.014305944779045657\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781086\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6095526914329037,\n \
\ \"acc_stderr\": 0.013437829864668583\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/bagel-dpo-34b-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|arc:challenge|25_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|arc:challenge|25_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|gsm8k|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|gsm8k|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hellaswag|10_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hellaswag|10_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-10-08.473090.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-16-58.738953.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T04-16-58.738953.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- '**/details_harness|winogrande|5_2024-01-05T04-10-08.473090.parquet'
- split: 2024_01_05T04_16_58.738953
path:
- '**/details_harness|winogrande|5_2024-01-05T04-16-58.738953.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T04-16-58.738953.parquet'
- config_name: results
data_files:
- split: 2024_01_05T04_10_08.473090
path:
- results_2024-01-05T04-10-08.473090.parquet
- split: 2024_01_05T04_16_58.738953
path:
- results_2024-01-05T04-16-58.738953.parquet
- split: latest
path:
- results_2024-01-05T04-16-58.738953.parquet
---
# Dataset Card for Evaluation run of jondurbin/bagel-dpo-34b-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jondurbin/bagel-dpo-34b-v0.2](https://huggingface.co/jondurbin/bagel-dpo-34b-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__bagel-dpo-34b-v0.2",
"harness_winogrande_5",
split="train")
```
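The same pattern works for any configuration and split listed above. A minimal sketch, assuming the "results" config and the timestamped split names shown in this card's configuration are accepted verbatim by `load_dataset`:
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_jondurbin__bagel-dpo-34b-v0.2"

# Aggregated metrics for the most recent run ("results" config, "latest" split)
results = load_dataset(repo, "results", split="latest")

# A specific timestamped run of one task; the split name matches the run
# timestamp listed in the configuration above (assumed usable as-is)
winogrande_run = load_dataset(repo,
	"harness_winogrande_5",
	split="2024_01_05T04_16_58.738953")
```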
## Latest results
These are the [latest results from run 2024-01-05T04:16:58.738953](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__bagel-dpo-34b-v0.2/blob/main/results_2024-01-05T04-16-58.738953.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7613608627936146,
"acc_stderr": 0.028274274385660204,
"acc_norm": 0.7665014924179901,
"acc_norm_stderr": 0.028800772478207726,
"mc1": 0.5336597307221542,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.7005121569261619,
"mc2_stderr": 0.014305944779045657
},
"harness|arc:challenge|25": {
"acc": 0.6902730375426621,
"acc_stderr": 0.013512058415238363,
"acc_norm": 0.7192832764505119,
"acc_norm_stderr": 0.013131238126975578
},
"harness|hellaswag|10": {
"acc": 0.6579366660027883,
"acc_stderr": 0.004734311435009195,
"acc_norm": 0.8525194184425413,
"acc_norm_stderr": 0.0035385967737048152
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.875,
"acc_stderr": 0.026913523521537846,
"acc_norm": 0.875,
"acc_norm_stderr": 0.026913523521537846
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372274,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9027777777777778,
"acc_stderr": 0.024774516250440182,
"acc_norm": 0.9027777777777778,
"acc_norm_stderr": 0.024774516250440182
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387536,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387536
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7310344827586207,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.7310344827586207,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.02326651221373057,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.02326651221373057
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6190476190476191,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.6190476190476191,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9032258064516129,
"acc_stderr": 0.016818943416345197,
"acc_norm": 0.9032258064516129,
"acc_norm_stderr": 0.016818943416345197
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706456,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706456
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9242424242424242,
"acc_stderr": 0.018852670234993093,
"acc_norm": 0.9242424242424242,
"acc_norm_stderr": 0.018852670234993093
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.011464523356953162,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.011464523356953162
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8179487179487179,
"acc_stderr": 0.0195652367829309,
"acc_norm": 0.8179487179487179,
"acc_norm_stderr": 0.0195652367829309
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.030417716961717477,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.030417716961717477
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.024762902678057933,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.024762902678057933
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9174311926605505,
"acc_stderr": 0.01180036136301657,
"acc_norm": 0.9174311926605505,
"acc_norm_stderr": 0.01180036136301657
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.03191923445686185,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.03191923445686185
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.027157150479563824,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.027157150479563824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540637,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540637
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563275,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.026845765054553848,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.026845765054553848
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446912,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9054916985951469,
"acc_stderr": 0.010461015338193071,
"acc_norm": 0.9054916985951469,
"acc_norm_stderr": 0.010461015338193071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8179190751445087,
"acc_stderr": 0.020776761102512975,
"acc_norm": 0.8179190751445087,
"acc_norm_stderr": 0.020776761102512975
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8067039106145252,
"acc_stderr": 0.013206868561343229,
"acc_norm": 0.8067039106145252,
"acc_norm_stderr": 0.013206868561343229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8464052287581699,
"acc_stderr": 0.020645597910418777,
"acc_norm": 0.8464052287581699,
"acc_norm_stderr": 0.020645597910418777
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8038585209003215,
"acc_stderr": 0.022552447780478033,
"acc_norm": 0.8038585209003215,
"acc_norm_stderr": 0.022552447780478033
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.018877353839571842,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.018877353839571842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6276595744680851,
"acc_stderr": 0.02883892147125145,
"acc_norm": 0.6276595744680851,
"acc_norm_stderr": 0.02883892147125145
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5788787483702738,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.5788787483702738,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8308823529411765,
"acc_stderr": 0.022770868010113014,
"acc_norm": 0.8308823529411765,
"acc_norm_stderr": 0.022770868010113014
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.815359477124183,
"acc_stderr": 0.01569702924075778,
"acc_norm": 0.815359477124183,
"acc_norm_stderr": 0.01569702924075778
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8367346938775511,
"acc_stderr": 0.02366169917709861,
"acc_norm": 0.8367346938775511,
"acc_norm_stderr": 0.02366169917709861
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.02353755765789255,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.02353755765789255
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5336597307221542,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.7005121569261619,
"mc2_stderr": 0.014305944779045657
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781086
},
"harness|gsm8k|5": {
"acc": 0.6095526914329037,
"acc_stderr": 0.013437829864668583
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
KaiLv/UDR_Amazon | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: label
dtype: int64
- name: headline
dtype: string
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 13936883
num_examples: 30000
- name: test
num_bytes: 1382953
num_examples: 3000
- name: debug
num_bytes: 2318411
num_examples: 5000
download_size: 11799872
dataset_size: 17638247
---
# Dataset Card for "UDR_Amazon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liyongsea/empty_function_kaggle | ---
dataset_info:
features:
- name: file_id
dtype: string
- name: content
dtype: string
- name: local_path
dtype: string
- name: kaggle_dataset_name
dtype: string
- name: kaggle_dataset_owner
dtype: string
- name: kversion
dtype: string
- name: kversion_datasetsources
dtype: string
- name: dataset_versions
dtype: string
- name: datasets
dtype: string
- name: users
dtype: string
- name: script
dtype: string
- name: df_info
dtype: string
- name: has_data_info
dtype: bool
- name: nb_filenames
dtype: int64
- name: retreived_data_description
dtype: string
- name: script_nb_tokens
dtype: int64
- name: upvotes
dtype: int64
- name: tokens_description
dtype: int64
- name: tokens_script
dtype: int64
splits:
- name: train
num_bytes: 1895686.5998786655
num_examples: 84
download_size: 1763341
dataset_size: 1895686.5998786655
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "empty_function_kaggle"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
spyroot/cornell_sum_movie_dialog | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: movieID
dtype: string
- name: movieTitle
dtype: string
- name: movieYear
dtype: string
- name: movieIMDBRating
dtype: string
- name: movieNoIMDBVotes
dtype: string
- name: movieGenres
sequence: string
- name: utterance
sequence:
- name: lines
dtype: string
- name: lids
dtype: string
splits:
- name: train
num_bytes: 32283731
num_examples: 83097
download_size: 0
dataset_size: 32283731
---
|
hamdan07/UItrasound-Lung | ---
license: bigscience-bloom-rail-1.0
---
|
Cheetor1996/Siesta_zero_no_tsukaima_V2 | ---
license: cc-by-2.0
language:
- en
tags:
- art
---
Siesta from **Zero no Tsukaima/The Familiar of Zero**
- Works best with the ALL, MIDD, OUTD, and OUTALL LoRA weight blocks.
- Use 0.8 to 1.0 weights. |
lillybak/sft_dataset_rlaif | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 3126
num_examples: 5
download_size: 6861
dataset_size: 3126
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
one-sec-cv12/chunk_229 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 17427141216.75
num_examples: 181442
download_size: 14290629007
dataset_size: 17427141216.75
---
# Dataset Card for "chunk_229"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
samriffle/Preprocessor-Service | ---
license: cdla-permissive-2.0
---
|
MuGeminorum/AAL_statistics_volumn | ---
license: mit
task_categories:
- image-classification
- feature-extraction
tags:
- biology
- medical
pretty_name: AAL Statistics Volumn
size_categories:
- n<1K
language:
- en
---
# Dataset Card for "MuGeminorum/AAL-statistics-volumn"
The AAL (Automated Anatomical Labeling) Statistics Volumetric dataset provides a comprehensive collection of brain volumetric measurements based on the AAL atlas. It encompasses statistical information on brain regions derived from structural magnetic resonance imaging (MRI) scans. Researchers commonly utilize this dataset for investigations related to neuroimaging, neuroscience, and brain structure analysis. The AAL Statistics Volumetric dataset plays a pivotal role in advancing our understanding of brain anatomy, enabling the development and evaluation of algorithms for automated brain region identification and volumetric analysis. With its wealth of volumetric data derived from diverse individuals, this dataset serves as a valuable resource for studies aimed at characterizing variations in brain structures across populations and contributing to advancements in neuroscientific research.
## Usage
```python
from datasets import load_dataset
data = load_dataset("MuGeminorum/AAL-statistics-volumn", split='train')
for item in data:
print(item)
```
## Maintenance
```bash
git clone git@hf.co:datasets/MuGeminorum/AAL-statistics-volumn
```
## Mirror
<https://www.modelscope.cn/datasets/MuGeminorum/AAL_statistics_volumn>
## Reference
[1] [Chapter II ‐ Classifying AD patients and normal controls from brain images](https://github.com/MuGeminorum/Medical_Image_Computing/wiki/Chapter-II-%E2%80%90-Classifying-AD-patients-and-normal-controls-from-brain-images)
|
open-llm-leaderboard/details_Mihaiii__Pallas-0.3 | ---
pretty_name: Evaluation run of Mihaiii/Pallas-0.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Mihaiii/Pallas-0.3](https://huggingface.co/Mihaiii/Pallas-0.3) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mihaiii__Pallas-0.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-25T01:14:20.652633](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Pallas-0.3/blob/main/results_2023-12-25T01-14-20.652633.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7454130168383197,\n\
\ \"acc_stderr\": 0.0290982917633922,\n \"acc_norm\": 0.7502773723845314,\n\
\ \"acc_norm_stderr\": 0.029647900326113162,\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5731028245386227,\n\
\ \"mc2_stderr\": 0.015807029979791075\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6194539249146758,\n \"acc_stderr\": 0.014188277712349814,\n\
\ \"acc_norm\": 0.6373720136518771,\n \"acc_norm_stderr\": 0.01404910656495501\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6453893646683927,\n\
\ \"acc_stderr\": 0.004774174590205144,\n \"acc_norm\": 0.8330013941445927,\n\
\ \"acc_norm_stderr\": 0.0037221237096104584\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8618421052631579,\n \"acc_stderr\": 0.028081042939576552,\n\
\ \"acc_norm\": 0.8618421052631579,\n \"acc_norm_stderr\": 0.028081042939576552\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372277,\n\
\ \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372277\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.0289198029561349,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.0289198029561349\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7341040462427746,\n\
\ \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.7341040462427746,\n\
\ \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387533,\n\
\ \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387533\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n\
\ \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n\
\ \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7034482758620689,\n \"acc_stderr\": 0.03806142687309993,\n\
\ \"acc_norm\": 0.7034482758620689,\n \"acc_norm_stderr\": 0.03806142687309993\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6851851851851852,\n \"acc_stderr\": 0.023919984164047736,\n \"\
acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.023919984164047736\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.896774193548387,\n \"acc_stderr\": 0.017308381281034516,\n\
\ \"acc_norm\": 0.896774193548387,\n \"acc_norm_stderr\": 0.017308381281034516\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.6650246305418719,\n \"acc_stderr\": 0.033208527423483104,\n \"\
acc_norm\": 0.6650246305418719,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\"\
: 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284332,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284332\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9090909090909091,\n \"acc_stderr\": 0.020482086775424218,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.020482086775424218\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527041,\n\
\ \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527041\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7923076923076923,\n \"acc_stderr\": 0.020567539567246794,\n\
\ \"acc_norm\": 0.7923076923076923,\n \"acc_norm_stderr\": 0.020567539567246794\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.43703703703703706,\n \"acc_stderr\": 0.030242862397654002,\n \
\ \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.030242862397654002\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8277310924369747,\n \"acc_stderr\": 0.024528664971305424,\n\
\ \"acc_norm\": 0.8277310924369747,\n \"acc_norm_stderr\": 0.024528664971305424\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.908256880733945,\n \"acc_stderr\": 0.012376323409137092,\n \"\
acc_norm\": 0.908256880733945,\n \"acc_norm_stderr\": 0.012376323409137092\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6481481481481481,\n \"acc_stderr\": 0.032568505702936464,\n \"\
acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.032568505702936464\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658925,\n \"\
acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658925\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8945147679324894,\n \"acc_stderr\": 0.019995560723758545,\n \
\ \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.019995560723758545\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476074,\n\
\ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476074\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035206,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035206\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243631001,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243631001\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.026321383198783674,\n\
\ \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.026321383198783674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n\
\ \"acc_stderr\": 0.017004368568132366,\n \"acc_norm\": 0.9273504273504274,\n\
\ \"acc_norm_stderr\": 0.017004368568132366\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036623,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036623\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9003831417624522,\n\
\ \"acc_stderr\": 0.010709685591251671,\n \"acc_norm\": 0.9003831417624522,\n\
\ \"acc_norm_stderr\": 0.010709685591251671\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8121387283236994,\n \"acc_stderr\": 0.02102926975242322,\n\
\ \"acc_norm\": 0.8121387283236994,\n \"acc_norm_stderr\": 0.02102926975242322\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6614525139664804,\n\
\ \"acc_stderr\": 0.015826700096481353,\n \"acc_norm\": 0.6614525139664804,\n\
\ \"acc_norm_stderr\": 0.015826700096481353\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8071895424836601,\n \"acc_stderr\": 0.022589318888176693,\n\
\ \"acc_norm\": 0.8071895424836601,\n \"acc_norm_stderr\": 0.022589318888176693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n\
\ \"acc_stderr\": 0.023475581417861106,\n \"acc_norm\": 0.7813504823151125,\n\
\ \"acc_norm_stderr\": 0.023475581417861106\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.019766459563597252,\n\
\ \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.019766459563597252\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6099290780141844,\n \"acc_stderr\": 0.029097675599463933,\n \
\ \"acc_norm\": 0.6099290780141844,\n \"acc_norm_stderr\": 0.029097675599463933\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5801825293350718,\n\
\ \"acc_stderr\": 0.012604960816087364,\n \"acc_norm\": 0.5801825293350718,\n\
\ \"acc_norm_stderr\": 0.012604960816087364\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.02334516361654484,\n\
\ \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.02334516361654484\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7892156862745098,\n \"acc_stderr\": 0.016500472979024794,\n \
\ \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.016500472979024794\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.02412746346265016,\n\
\ \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.02412746346265016\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.031446603773522035,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.031446603773522035\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.03878626771002361,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.03878626771002361\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n\
\ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5731028245386227,\n\
\ \"mc2_stderr\": 0.015807029979791075\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8066298342541437,\n \"acc_stderr\": 0.011099796645920522\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6027293404094011,\n \
\ \"acc_stderr\": 0.013478659652337799\n }\n}\n```"
repo_url: https://huggingface.co/Mihaiii/Pallas-0.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|arc:challenge|25_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|arc:challenge|25_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|gsm8k|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|gsm8k|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hellaswag|10_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hellaswag|10_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T21-25-41.795563.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-25T01-14-20.652633.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-25T01-14-20.652633.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- '**/details_harness|winogrande|5_2023-12-23T21-25-41.795563.parquet'
- split: 2023_12_25T01_14_20.652633
path:
- '**/details_harness|winogrande|5_2023-12-25T01-14-20.652633.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-25T01-14-20.652633.parquet'
- config_name: results
data_files:
- split: 2023_12_23T21_25_41.795563
path:
- results_2023-12-23T21-25-41.795563.parquet
- split: 2023_12_25T01_14_20.652633
path:
- results_2023-12-25T01-14-20.652633.parquet
- split: latest
path:
- results_2023-12-25T01-14-20.652633.parquet
---
# Dataset Card for Evaluation run of Mihaiii/Pallas-0.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Mihaiii/Pallas-0.3](https://huggingface.co/Mihaiii/Pallas-0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mihaiii__Pallas-0.3",
"harness_winogrande_5",
split="train")
```
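You can also list the available configurations programmatically and load the most recent run of a task through its "latest" split (a minimal sketch; the task name used below is just one of the 63 configurations listed above):
```python
from datasets import get_dataset_config_names, load_dataset

# Enumerate the 63 configurations (one per evaluated task, plus "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_Mihaiii__Pallas-0.3")
print(configs[:5])

# Load the most recent run of a single task via its "latest" split.
details = load_dataset(
    "open-llm-leaderboard/details_Mihaiii__Pallas-0.3",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
```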
## Latest results
These are the [latest results from run 2023-12-25T01:14:20.652633](https://huggingface.co/datasets/open-llm-leaderboard/details_Mihaiii__Pallas-0.3/blob/main/results_2023-12-25T01-14-20.652633.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7454130168383197,
"acc_stderr": 0.0290982917633922,
"acc_norm": 0.7502773723845314,
"acc_norm_stderr": 0.029647900326113162,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.5731028245386227,
"mc2_stderr": 0.015807029979791075
},
"harness|arc:challenge|25": {
"acc": 0.6194539249146758,
"acc_stderr": 0.014188277712349814,
"acc_norm": 0.6373720136518771,
"acc_norm_stderr": 0.01404910656495501
},
"harness|hellaswag|10": {
"acc": 0.6453893646683927,
"acc_stderr": 0.004774174590205144,
"acc_norm": 0.8330013941445927,
"acc_norm_stderr": 0.0037221237096104584
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8618421052631579,
"acc_stderr": 0.028081042939576552,
"acc_norm": 0.8618421052631579,
"acc_norm_stderr": 0.028081042939576552
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372277,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372277
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.0289198029561349,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.0289198029561349
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387533,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387533
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7034482758620689,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.7034482758620689,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.023919984164047736,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.023919984164047736
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.017308381281034516,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.017308381281034516
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284332,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284332
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.020482086775424218,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.020482086775424218
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527041,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527041
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7923076923076923,
"acc_stderr": 0.020567539567246794,
"acc_norm": 0.7923076923076923,
"acc_norm_stderr": 0.020567539567246794
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.030242862397654002,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.030242862397654002
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8277310924369747,
"acc_stderr": 0.024528664971305424,
"acc_norm": 0.8277310924369747,
"acc_norm_stderr": 0.024528664971305424
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.908256880733945,
"acc_stderr": 0.012376323409137092,
"acc_norm": 0.908256880733945,
"acc_norm_stderr": 0.012376323409137092
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.032568505702936464,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.032568505702936464
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658925,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658925
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.019995560723758545,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.019995560723758545
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8244274809160306,
"acc_stderr": 0.03336820338476074,
"acc_norm": 0.8244274809160306,
"acc_norm_stderr": 0.03336820338476074
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035206,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035206
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631001,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631001
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.026321383198783674,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.026321383198783674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.017004368568132366,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.017004368568132366
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9003831417624522,
"acc_stderr": 0.010709685591251671,
"acc_norm": 0.9003831417624522,
"acc_norm_stderr": 0.010709685591251671
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8121387283236994,
"acc_stderr": 0.02102926975242322,
"acc_norm": 0.8121387283236994,
"acc_norm_stderr": 0.02102926975242322
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6614525139664804,
"acc_stderr": 0.015826700096481353,
"acc_norm": 0.6614525139664804,
"acc_norm_stderr": 0.015826700096481353
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8071895424836601,
"acc_stderr": 0.022589318888176693,
"acc_norm": 0.8071895424836601,
"acc_norm_stderr": 0.022589318888176693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7813504823151125,
"acc_stderr": 0.023475581417861106,
"acc_norm": 0.7813504823151125,
"acc_norm_stderr": 0.023475581417861106
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.019766459563597252,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.019766459563597252
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6099290780141844,
"acc_stderr": 0.029097675599463933,
"acc_norm": 0.6099290780141844,
"acc_norm_stderr": 0.029097675599463933
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5801825293350718,
"acc_stderr": 0.012604960816087364,
"acc_norm": 0.5801825293350718,
"acc_norm_stderr": 0.012604960816087364
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8198529411764706,
"acc_stderr": 0.02334516361654484,
"acc_norm": 0.8198529411764706,
"acc_norm_stderr": 0.02334516361654484
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.016500472979024794,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.016500472979024794
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.02412746346265016,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.02412746346265016
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.031446603773522035,
"acc_norm": 0.89,
"acc_norm_stderr": 0.031446603773522035
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.03878626771002361,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.03878626771002361
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.5731028245386227,
"mc2_stderr": 0.015807029979791075
},
"harness|winogrande|5": {
"acc": 0.8066298342541437,
"acc_stderr": 0.011099796645920522
},
"harness|gsm8k|5": {
"acc": 0.6027293404094011,
"acc_stderr": 0.013478659652337799
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-114000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 640042
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
marmofayezi/M3GenMaskEng | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: mask
dtype: image
- name: caption
dtype: string
- name: generated_image
dtype: image
splits:
- name: train
num_bytes: 2327348093.75
num_examples: 2998
download_size: 1959222030
dataset_size: 2327348093.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
carnival13/test_DA_tokenized2 | ---
dataset_info:
features:
- name: pass_label
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 456736095
num_examples: 335850
download_size: 104506387
dataset_size: 456736095
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test_DA_tokenized2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pharaouk/wikipedia-en | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 20200062385
num_examples: 6407814
download_size: 11623249641
dataset_size: 20200062385
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
louisbrulenaudet/code-forestier-nouveau | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code forestier (nouveau)
source_datasets:
- original
pretty_name: Code forestier (nouveau)
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code forestier (nouveau), non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
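As a rough illustration (not part of this dataset's tooling), an instruction/input/output record can be rendered into a single training prompt with a simple template; the field names follow the schema described under "Dataset generation" below, and the template itself is an assumption:
```python
def build_prompt(example: dict) -> str:
    """Render an instruction/input/output record into one training string.

    The template is illustrative only; adapt it to your fine-tuning framework.
    """
    return (
        f"### Instruction:\n{example['instruction']}\n\n"
        f"### Input:\n{example['input']}\n\n"
        f"### Response:\n{example['output']}"
    )
```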
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os
import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields (a minimal loading sketch follows the list):
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
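For a quick structural check, here is a minimal loading sketch; it assumes the fields listed above are exposed as columns of a `train` split, which is an assumption rather than something stated on this card:
```python
from datasets import load_dataset

# Minimal sketch: load this dataset and print the documented fields of one record.
# Assumes the fields above are exposed as columns of the "train" split.
ds = load_dataset("louisbrulenaudet/code-forestier-nouveau", split="train")

row = ds[0]
for field in ("instruction", "input", "output", "start", "expiration", "num"):
    print(field, "->", row.get(field))
```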
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
Vishal24/tinyllama_review_summary | ---
license: llama2
---
|
GangCaoLab/FISH_spots | ---
license: mit
---
# Manually verified in situ hybridization (FISH) fluorescence images and point-coordinate dataset
```bash
# Make sure you have git-lfs installed (https://git-lfs.com)
git lfs install
git clone https://huggingface.co/datasets/GangCaoLab/FISH_spots
```
|
renumics/spotlight-laion-dalle-3-dataset-enrichment | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: caption.embedding
sequence: float32
length: 2
- name: link.embedding
sequence: float32
length: 2
- name: message_id.embedding
sequence: float32
length: 2
- name: timestamp.embedding
sequence: float32
length: 2
splits:
- name: train
num_bytes: 47200
num_examples: 1475
download_size: 67788
dataset_size: 47200
---
# Dataset Card for "spotlight-laion-dalle-3-dataset-enrichment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
giobin/crafter_random_walks_random_inventory | ---
dataset_info:
features:
- name: image
dtype: image
- name: obs
dtype: string
- name: id
dtype: int64
- name: episode
dtype: int64
- name: unique_string
dtype: string
splits:
- name: train
num_bytes: 198275610.713
num_examples: 9491
download_size: 229440089
dataset_size: 198275610.713
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_153 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1123763664.0
num_examples: 220692
download_size: 1147744099
dataset_size: 1123763664.0
---
# Dataset Card for "chunk_153"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arbml/FontsLargeSpaced | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 43557017.054
num_examples: 9899
download_size: 28289205
dataset_size: 43557017.054
---
# Dataset Card for "FontsLargeSpaced"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alisson40889/madruguinho | ---
license: openrail
---
|
mekaneeky/SALT-languages-bible | ---
dataset_info:
features:
- name: ach
dtype: string
- name: eng
dtype: string
- name: lgg
dtype: string
- name: lug
dtype: string
- name: nyn
dtype: string
- name: teo
dtype: string
- name: amh
dtype: string
- name: ibo
dtype: string
- name: nyo
dtype: string
- name: xog
dtype: string
- name: swh
dtype: string
- name: verse_key
dtype: string
splits:
- name: train
num_bytes: 46596510
num_examples: 31148
download_size: 24863068
dataset_size: 46596510
---
# Dataset Card for "SALT-languages-bible"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-squad-95d5e1fd-11835579 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad
eval_info:
task: extractive_question_answering
model: deepset/roberta-large-squad2
metrics: []
dataset_name: squad
dataset_config: plain_text
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: deepset/roberta-large-squad2
* Dataset: squad
* Config: plain_text
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mbartolo](https://huggingface.co/mbartolo) for evaluating this model. |
autoevaluate/autoeval-eval-samsum-samsum-417ba9-2386774737 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: NYTK/summarization-hi-bart-base-1024-hungarian
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: test
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: NYTK/summarization-hi-bart-base-1024-hungarian
* Dataset: samsum
* Config: samsum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
maghwa/OpenHermes-2-AR-10K-6 | ---
dataset_info:
features:
- name: avatarUrl
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: views
dtype: float64
- name: source
dtype: string
- name: model
dtype: 'null'
- name: model_name
dtype: 'null'
- name: conversations
dtype: string
- name: category
dtype: 'null'
- name: language
dtype: 'null'
- name: topic
dtype: 'null'
- name: id
dtype: string
- name: hash
dtype: 'null'
- name: idx
dtype: 'null'
- name: title
dtype: 'null'
- name: skip_prompt_formatting
dtype: 'null'
- name: system_prompt
dtype: 'null'
splits:
- name: train
num_bytes: 25330097
num_examples: 10001
download_size: 9644399
dataset_size: 25330097
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_mnli_you_ye | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 333732
num_examples: 1559
- name: dev_mismatched
num_bytes: 250284
num_examples: 1378
- name: test_matched
num_bytes: 335472
num_examples: 1545
- name: test_mismatched
num_bytes: 242963
num_examples: 1321
- name: train
num_bytes: 13815206
num_examples: 63320
download_size: 9129370
dataset_size: 14977657
---
# Dataset Card for "MULTI_VALUE_mnli_you_ye"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jotschi/coco-karpathy-simple-en | ---
language:
- en
license_name: cc-by-4.0
license_link: https://creativecommons.org/licenses/by/4.0/legalcode
tags:
- coco
- mscoco
- simple-english
annotations_creators:
- machine-generated
pretty_name: MS COCO Karpathy in Simple English
size_categories:
- n<650k
source_datasets:
- mscoco
task_categories:
- text-generation
- image-to-text
- text-to-image
---
# Dataset Card for MS COCO Karpathy in Simple English
This dataset contains captions that were rephrased into simple English so that a young child could understand them.
## Dataset Details
### Dataset Description
- **Curated by:** [More Information Needed]
- **Language(s) (NLP):** English
- **License:** CC BY 4.0
### Dataset Sources
The processed [MS COCO datasets](https://cocodataset.org/#download) (Karpathy Split) in this repo are based on the following sources:
| Type | MD5 | URL |
|------------|----------------------------------|-----------------------------------------------------------------------------------------------|
| Train | aa31ac474cf6250ebb81d18348a07ed8 | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_train.json |
| Validation | b273847456ef5580e33713b1f7de52a0 | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_val.json |
| Test | 3ff34b0ef2db02d01c37399f6a2a6cd1 | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_test.json |
MS COCO:
- **Download:** https://cocodataset.org/#download
- **Paper:** http://arxiv.org/abs/1405.0312
## Dataset Creation
This dataset was generated by processing the annotations via [Mistral 7B](https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.2-AWQ).
Prompt used:
```
Rewrite the sentence " + caption + " for a 3 to 4 year old child. Give only one simple sentence. Don't use the word see. Give only a single answer.
```
A filter was applied to keep only captions that matched the common output format, and a best-effort filter was applied to reduce the chance of including multiple example sentences in the output.
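A minimal sketch of what such a filter could look like is shown below; the exact rules are not published, so the `keep_caption` helper and its regexes are assumptions for illustration only:
```python
import re

def keep_caption(generated: str) -> bool:
    """Hypothetical filter: keep only single-sentence outputs that avoid the
    word 'see' and do not look like a list of several alternative answers."""
    text = generated.strip()
    if re.search(r"\bsee\b", text, flags=re.IGNORECASE):
        return False
    if re.search(r"^\s*\d+[.)]\s", text, flags=re.MULTILINE):
        return False  # numbered list -> probably multiple answers
    # exactly one sentence: a single terminal punctuation mark, at the very end
    return len(re.findall(r"[.!?]", text)) == 1 and text.endswith((".", "!", "?"))
```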
### Curation Rationale
This dataset is useful for experiments with small LLMs that have only a reduced corpus. It is suitable for LAVIS experiments (Q-Former training) with a fine-tuned TinyStories 33M LLM.
|
liuyanchen1015/MULTI_VALUE_mnli_say_complementizer | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 134276
num_examples: 573
- name: dev_mismatched
num_bytes: 129393
num_examples: 538
- name: test_matched
num_bytes: 145704
num_examples: 584
- name: test_mismatched
num_bytes: 129452
num_examples: 517
- name: train
num_bytes: 5622384
num_examples: 23545
download_size: 3757829
dataset_size: 6161209
---
# Dataset Card for "MULTI_VALUE_mnli_say_complementizer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hieule/news_corpus_v2_p2 | ---
dataset_info:
features:
- name: source
dtype: string
- name: title
dtype: string
- name: sapo
dtype: string
- name: cates
dtype: 'null'
- name: publish
dtype: timestamp[us]
- name: text_content
dtype: string
splits:
- name: train
num_bytes: 18238489193
num_examples: 5000000
download_size: 9130800517
dataset_size: 18238489193
---
# Dataset Card for "news_corpus_v2_p2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mattyhatch/tomatoesCWSI | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 1034721.0
num_examples: 6
download_size: 134150
dataset_size: 1034721.0
---
# Dataset Card for "tomatoesCWSI"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mush42/piper-rt | ---
license: mit
---
|
Emanuse/greenwashing | ---
license: mit
---
|
TKKG/inferno | ---
license: afl-3.0
---
|
HydraLM/partitioned_v2_standardized_01 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
splits:
- name: train
num_bytes: 29375204.585235614
num_examples: 57468
download_size: 22913184
dataset_size: 29375204.585235614
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v2_standardized_01"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alejandrapulidoa/stocks | ---
language:
- en
tags:
- finance
---
|
bot-yaya/un_pdf_random9208_preprocessed_2 | ---
dataset_info:
features:
- name: zh
dtype: string
- name: en
dtype: string
- name: fr
dtype: string
- name: es
dtype: string
- name: ru
dtype: string
- name: record
dtype: string
splits:
- name: train
num_bytes: 1741751686
num_examples: 9208
download_size: 830077813
dataset_size: 1741751686
---
# Dataset Card for "un_pdf_random9208_preprocessed_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Helsinki-NLP/opus_montenegrinsubs | ---
annotations_creators:
- found
language_creators:
- found
language:
- cnr
- en
license:
- unknown
multilinguality:
- translation
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- translation
task_ids: []
pretty_name: OpusMontenegrinsubs
dataset_info:
config_name: en-me
features:
- name: translation
dtype:
translation:
languages:
- en
- me
splits:
- name: train
num_bytes: 4896347
num_examples: 65043
download_size: 3376459
dataset_size: 4896347
configs:
- config_name: en-me
data_files:
- split: train
path: en-me/train-*
---
# Dataset Card for opus_montenegrinsubs
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**[opus MontenegrinSubs ](http://opus.nlpl.eu/MontenegrinSubs.php)
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Opus MontenegrinSubs dataset for the machine translation task, for the language pair en-me: English and Montenegrin.
### Supported Tasks and Leaderboards
The underlying task is machine translation from en to me; a minimal loading sketch is shown below.
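The following is an unofficial loading sketch; the `en-me` configuration name and the `translation` feature come from the YAML header above, but the snippet itself is an illustration rather than part of the original card:
```python
from datasets import load_dataset

# Minimal sketch: load the en-me configuration and print one translation pair.
ds = load_dataset("Helsinki-NLP/opus_montenegrinsubs", "en-me", split="train")

pair = ds[0]["translation"]
print(pair["en"], "->", pair["me"])
```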
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
J. Tiedemann, 2012, Parallel Data, Tools and Interfaces in OPUS. In Proceedings of the 8th International Conference on Language Resources and Evaluation (LREC 2012)
### Contributions
Thanks to [@spatil6](https://github.com/spatil6) for adding this dataset. |
DJMOON/RC-49_128x128_diffusion | ---
dataset_info:
features:
- name: image
dtype: image
- name: angle
dtype: float64
- name: ratio
dtype: float64
- name: type
dtype: float64
splits:
- name: train
num_bytes: 1062717643.09
num_examples: 264894
download_size: 1203438100
dataset_size: 1062717643.09
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Eddycrack864/Music-Dataset | ---
license: openrail
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/c21a7a19 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1336
dataset_size: 188
---
# Dataset Card for "c21a7a19"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hxgrace/syntheticHands269 | ---
dataset_info:
features:
- name: frames
sequence: image
- name: sketch
dtype: image
- name: label
dtype: string
splits:
- name: train
num_bytes: 507943100.0
num_examples: 269
download_size: 3596173
dataset_size: 507943100.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
The "frames" column of this dataset holds a sequence of images, where each sequence represents one video. Each video depicts a virtual hand, animated in Unity, tracing out sketches from the [Quick, Draw!](https://quickdraw.withgoogle.com) dataset. The "sketch" column shows the original sketch, and the "label" column is the prompt given to the artist before each sketch was drawn. |
bossmomo/Jack | ---
license: apache-2.0
language:
- th
tags:
- art
- code
pretty_name: Thai sum
size_categories:
- 10M<n<100M
---
```python
from datasets import load_dataset

dataset = load_dataset("Bossmomoga/Thaidt")
``` |
hlillemark/flores200_devtest_mt5-3b-flores200-scaffold | ---
dataset_info:
features:
- name: id
dtype: int32
- name: source_lang
dtype: string
- name: target_lang
dtype: string
- name: source
dtype: string
- name: target
dtype: string
- name: prediction
dtype: string
- name: chrf_unreduced
dtype: string
splits:
- name: devtest
num_bytes: 372332916
num_examples: 500000
download_size: 261837967
dataset_size: 372332916
---
# Dataset Card for "flores200_devtest_mt5-3b-flores200-scaffold"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibragim-bad/arcc_multilang | ---
dataset_info:
- config_name: ar
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: train
num_bytes: 515979
num_examples: 1117
- name: validation
num_bytes: 146393
num_examples: 298
- name: test
num_bytes: 555344
num_examples: 1169
download_size: 559228
dataset_size: 1217716
- config_name: de
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: train
num_bytes: 416218
num_examples: 1116
- name: validation
num_bytes: 116268
num_examples: 298
- name: test
num_bytes: 445928
num_examples: 1169
download_size: 513244
dataset_size: 978414
- config_name: es
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: train
num_bytes: 415815
num_examples: 1118
- name: validation
num_bytes: 116298
num_examples: 297
- name: test
num_bytes: 444815
num_examples: 1170
download_size: 499409
dataset_size: 976928
- config_name: fr
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: train
num_bytes: 431884
num_examples: 1118
- name: validation
num_bytes: 121206
num_examples: 298
- name: test
num_bytes: 460727
num_examples: 1169
download_size: 519321
dataset_size: 1013817
- config_name: he
features:
- name: index
dtype: int64
- name: ind
dtype: int64
- name: question
dtype: string
- name: choices
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: id
dtype: string
- name: answerKey
dtype: string
- name: split
dtype: string
splits:
- name: validation
num_bytes: 116970
num_examples: 270
download_size: 60796
dataset_size: 116970
- config_name: it
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: train
num_bytes: 411526
num_examples: 1118
- name: validation
num_bytes: 114977
num_examples: 297
- name: test
num_bytes: 439356
num_examples: 1169
download_size: 506239
dataset_size: 965859
- config_name: ru
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: train
num_bytes: 617514
num_examples: 1118
- name: validation
num_bytes: 171795
num_examples: 297
- name: test
num_bytes: 660294
num_examples: 1169
download_size: 669039
dataset_size: 1449603
configs:
- config_name: ar
data_files:
- split: train
path: ar/train-*
- split: validation
path: ar/validation-*
- split: test
path: ar/test-*
- config_name: de
data_files:
- split: train
path: de/train-*
- split: validation
path: de/validation-*
- split: test
path: de/test-*
- config_name: es
data_files:
- split: train
path: es/train-*
- split: validation
path: es/validation-*
- split: test
path: es/test-*
- config_name: fr
data_files:
- split: train
path: fr/train-*
- split: validation
path: fr/validation-*
- split: test
path: fr/test-*
- config_name: he
data_files:
- split: validation
path: he/validation-*
- config_name: it
data_files:
- split: train
path: it/train-*
- split: validation
path: it/validation-*
- split: test
path: it/test-*
- config_name: ru
data_files:
- split: train
path: ru/train-*
- split: validation
path: ru/validation-*
- split: test
path: ru/test-*
---
|
surafelkindu/Amharic_corpus | ---
license: mit
---
ዛጎል ዜና- መንግስት አምስት ሺህ የሚጠጉ እስረኞችን “ተመራቂዎች” በሚል መፍታቱን ይፋ ባደረገበት ቀን በተመሳሳይ አምቦ ተማሪዎች ተቃውሞ ማሰማታቸው ተሰማ። ተማሪዎቹ የአስቸኳይ አዋጁን በመጣስ ” መረራ ይፈታ” እያሉ ተቃውሞ መጀመራቸው ነው የተሰማው። ከትምህርት ቤት ወደ ትምህርት ቤት የሰፋው ተቃውሞ ብህይወት ላይ አደጋ ባያስከትልም በንብረት ላይ ግን ጉዳት አድርሷል። መኪና ሲቃጠል ያዩ የአይን ምስክሮች ተቃውሞውን በጀመሩት ላይም ሆነ ዘግይተው በተቀላቀሉት ላይ እንደ ቀደሞው ያለ የሃይል እርምጃ አልተወሰደም። የኦሮሚያ ሚዲያ ኔት ወርክ እንዳለው ደግሞ በርካታ ሰዎች ታስረዋል።
ለወትሮው ህገ መንግስቱን በሃይል ለመናድ የተነሱ፣ የነውጥ ሃይሎች፣ አተራማሾች፣ የጥፋት ሃይል ተላላኪዎች በሚል ተጠርጥረው በቁጥጥር ስር ከዋሉት መካከል 4035 የሚሆኑት ሲፈቱ እስረኞቹ “ስድስት ኮርስ ወስደው ተመረቁ” ነው የተባለው።
የኦሮሚያ ማረሚያ ቤቶች አስተዳደር ኮሚሽነር ፀሃይ በላይን ጠቅሶ ፋና እንደዘገበው ጦላይ ተሃድሶ ማዕከል ከገቡ 5 ሺህ 600 ሰልጣኞች መካከል 4035 ያህሉ በስድስት ዋና ዋና ጉዳዮች ሥልጠና ወስደው ተመርቀዋል። ኮርሶቹም በፍፁም፣ አይደገምም፣ የቀለም አብዮት፣ የኢትዮጰያ ህገ–መንግስት እና የኢትዮጵያ ህዳሴ የሚሉ ርዕሰ ጉዳዮችን የተካተቱባቸው ነው።
አበምርቃቱ ላይ ጠቅላይ ሚኒስትር ሃይለማርያም ተገኝተው “ ሽኝት” አደርጉላቸው ተብሏል። በርካታ ቃል ተገብቶላቸዋል። መስመርም ተሰምሮላቸዋል። “በደምና በአጥንት የተጻፈውን ሕገመንግስት፣ ዋጋ የተከፈለበትን ህገመንግስት” በማለት አቶ ሃይለማርያም በሃይል ለመናድ መሞከር አይቻልም በለዋል። “ ልክ እናንተ አይደገምም እንዳላችሁት፣ እኛም አይደገም እንላለን” ብለዋል። የፋና ዘገባ እንዲህ ይነበባል።
አዲስ አበባ ፣ ታህሳስ 12 ፣ 2009 (ኤፍ ቢ ሲ) በሃገሪቱ የተለያዩ አካባቢዎች በተፈጠረው ሁከት ውስጥ ተሳትፈው በማሰልጠኛ ጣቢያዎች የተሃድሶ ስልጠና ሲወስዱ የነበሩ ዜጎች ወደ መጡበት እየተመለሱ ነው። በአዋሽ፣ አላጌና ብር ሸለቆ ማዕከላት የተሃድሶ ስልጠና የወሰዱ ዜጎች ናቸው ወደ አካባቢያቸው እየተመለሱ ያሉት። በጦላይ ለአንድ ወር የተሃድሶ ስልጠና የወሰዱ 4 ሺህ 35 ዜጎችም ሥልጠናቸውን አጠናቀው ነገ ወደ መጡበት አካባቢ ይመለሳሉ ተብሏል።
በጦላይ የተሃድሶ ማዕከል የተገኙት ጠቅላይ ሚኒስትር ኃይለማርያም ደሳለኝ በዚሁ ጊዜ ባስተላለፉት መልዕክት ሰልጣኞች ወደ መደበኛ ህይወታቸው እንዲመለሱ መንግሥት ድጋፍ ያደርጋል ብለዋል። ሠራተኞች ወደ ሥራ ገበታቸው እንዲመለሱ የሚደረግ ሲሆን ተማሪዎች ደግሞ ትምህርታቸው እንዲቀጥሉ ይደረጋልም ነው ያሉት ጠቅላይ ሚኒስትር ኃይለማርያም።
ሥራ አጥ የሆኑ ወጣቶችም በራሳቸው መንገድ ሥራ እንዲፈጥሩ ድጋፍ እንደሚደረግላቸው ጠቅላይ ሚኒስትሩ ገልጸዋል። ሠላም፣ ልማትና ዴሞክራሲ የማይነጣጡ የአንድ አገር ህልውና መሰረት መሆናቸውን ወጣቱ ተገንዝቦ እነዚህን እሴቶች የመጠበቅ ኃላፊነቱን እንዲወጣ ጠይቀዋል። ወጣቱ ጥያቄ እንኳ ቢኖረው ሕገ-መንግሥቱ በሚፈቅደው መሰረት የማቅረብና መልስ የማግኘት መብት እንዳለው ገልጸዋል። ባለፉት ወራት እንደታየው ጥያቄውን በአመጽና ግርግር መጠየቁ ዋጋ እንዳስከፈለ ለማሳያነት በማንሳት።
እንዲህ ዓይነት ሁኔታ እንዳይደገም መንግሥትም የራሱን ስህተት ለማረም ጥልቅ ተሃድሶ እያደረገ መሆኑን ገልጸው ወጣቱም የራሱን ስህተት በማረም ከመንግሥት ጋር በመሆን ሠላሙን እንዲጠብቅ መልዕክት አስተላልፈዋል። የኦሮሚያ ክልል ርዕሰ መስተዳደር አቶ ለማ መገርሳ በበኩላቸው በክልሉ የሰፈነውን ሠላም ለማስቀጠል ከሁሉም የህብረተሰብ ክፍል ጋር በቅንጅት ሥራዎች ይሰራሉ ብለዋል።
ከወራት በፊት በተፈጠረው ሁከትና ግርግር ህይወት የጠፋ መሆኑን ገልጸው ለዘመናት የተለፋባቸው የህዝብ ኃብቶችም መውደማቸው አግባብ አለመሆኑን ተናግረዋል። ክልሉ ሊለወጥና ሊለማ የሚችለው የክልሉ ወጣቶች ለሠላም በጋራ ዘብ ሲቆሙ እንደሆነም አስምረውበታል።
አሁን ወደ |
ShuoChen99/adarobustness | ---
license: mit
---
|
leomiranda02/minhavoz98 | ---
license: openrail
---
|
mesolitica/synthetic-multiturn-multimodal | ---
license: mit
language:
- ms
- en
---
# Multiturn Multimodal
We want to generate synthetic data that teaches models to understand the position of and relationships between multiple images and multiple audio clips; examples are shown below.
All notebooks at https://github.com/mesolitica/malaysian-dataset/tree/master/chatbot/multiturn-multimodal
## multi-images
- [synthetic-multi-images-relationship.jsonl](synthetic-multi-images-relationship.jsonl), 100000 rows, 109MB.
- Images at https://huggingface.co/datasets/mesolitica/translated-LLaVA-Pretrain/tree/main
### Example data
```python
{'filename': ['00370/003705168.jpg', '00611/006111738.jpg'],
'filename_description': ['paco man edp gift set',
"raspberry lemonade is a must in new york's famous cocktail scene"],
'instruction': 'What is related between picture 1 and picture 2',
'answer': "There is no direct relation between Picture 1 (Paco Rabanne 1 Million Edp Gift Set) and Picture 2 (Raspberry Lemonade being a must in New York's famous cocktail scene). Both are unrelated to each other. The first picture is an image of a perfume gift set, while the second picture is a depiction of a popular cocktail in New York City.",
'instruction_ms': 'Apakah yang berkaitan antara gambar 1 dan gambar 2',
'answer_ms': 'Tiada hubungan langsung antara Gambar 1 (Paco Rabanne 1 Million Edp Gift Set) dan Gambar 2 (Raspberry Lemonade menjadi must dalam adegan koktel terkenal di New York). Kedua-duanya tidak berkaitan antara satu sama lain. Gambar pertama ialah imej set hadiah minyak wangi, manakala gambar kedua ialah gambaran koktel popular di New York City.'}
```
## multi-images multi-audio
- [synthetic-multi-images-multi-audio-relationship.jsonl](synthetic-multi-images-multi-audio-relationship.jsonl), 59400 rows, 96.6 MB.
- Images at https://huggingface.co/datasets/mesolitica/translated-LLaVA-Pretrain/tree/main
- Audio from https://huggingface.co/datasets/mesolitica/malaysian-youtube-audio-instructions/tree/main
### Example data
```python
{'filename': ['output-audio/3-2648-47.mp3', '00180/001805101.jpg'],
'filename_description': ['Saya mahu muka mereka terlihat beras, anda tahu apa yang saya maksudkan. Dan sanitizer. Dan kemudian ini adalah earphone. Sama-sama kalau airpod saya, anda tahu, hilang bateri. Saya tidak pasti jika saya patut membawa tripod saya kerana saya mungkin. Adakah saya akan melakukan TikTok di kafe? Saya tidak tahu tetapi tidak menyakiti untuk membawanya. Maksud saya, ia tidak begitu keras. Saya perlu membawa krim tangan saya. Dan kemudian bumbu. Dan lip balm. Dan kemudian kita siap untuk pergi.',
'a water wheel with moss growing on the wheels metal print by randall white'],
'instruction': 'What is related between audio 1 and picture 1',
'answer': "The audio and picture do not have a direct relation to each other. The audio is about preparing items for an outing, including sanitizer, earphones, a tripod, and various other personal items. The picture is a print of a water wheel with moss growing on it by Randall White. There is no connection between the audio's content and the picture's subject matter.",
'instruction_ms': 'Apakah yang berkaitan antara audio 1 dan gambar 1',
'answer_ms': 'Audio dan gambar tidak mempunyai hubungan langsung antara satu sama lain. Audio adalah mengenai penyediaan item untuk keluar, termasuk pembersih, fon telinga, tripod dan pelbagai barangan peribadi lain. Gambar itu ialah cetakan roda air dengan lumut yang tumbuh di atasnya oleh Randall White. Tiada kaitan antara kandungan audio dan subjek gambar.'}
```
## multi-audio
- [synthetic-multi-images-multi-audio-relationship.jsonl](synthetic-multi-images-multi-audio-relationship.jsonl), 25100 rows, 65.1 MB.
- Audio from https://huggingface.co/datasets/mesolitica/malaysian-youtube-audio-instructions/tree/main
### Example data
```python
{'filename': ['output-audio/3-2080-38.mp3', 'output-audio/0-2823-0.mp3'],
'filename_description': ['Terima kasih Menteri. Saya jemput soalan tambahan yang kedua. Bagan Serai. Terima kasih Tuan Speaker. Berapakah jumlah kemalangan yang menyebabkan kematian disebabkan oleh pengaruh handphone, penggunaan handphone semasa mandu. Kerana guna handphone mandu ini dia macam mabuk lebih Tuan Speaker. Dan dia hilang orientasi. Dia tak tahu di mana traffic light, dia tak tahu dia di mana berada dan tiba-tiba dah sampai. Jadi apa kerajaan nak buat untuk menurunkan tabiat buruk menggunakan handphone semasa mandu.',
'dalam video tu saya dah kitamkan kening lah sebab benda tu kita mencuba so at least kita dah mencuba kita kan nak mencuba kan masa ni lah mencuba kan janganlah pula usia macam aku dah 50 pun nak cuba kenapa masa buat lagu raya cover tu tak boleh hijau sebab dia nak image ketupat macam Aina Abdul juga dia ketupat kita bawa image rambut tu warna hijau ketupat juga kan tapi dah habis raya after this memang nak reveal jugalah kan habis ni memang saya akan kekalkan image yang very very formal je lah'],
'instruction': 'What is related between audio 1 and audio 2',
'answer': 'Audio 1 and Audio 2 are unrelated as they discuss different topics. In Audio 1, the speaker is discussing the issue of using handphones while driving and its contribution to accidents. In Audio 2, the speaker is talking about making a cover song for Raya and the challenges they faced in creating the image for the video.',
'instruction_ms': 'Apakah yang berkaitan antara audio 1 dan audio 2',
'answer_ms': 'Audio 1 dan Audio 2 tidak berkaitan kerana mereka membincangkan topik yang berbeza. Dalam Audio 1, penceramah membincangkan isu menggunakan fon tangan semasa memandu dan sumbangannya kepada kemalangan. Dalam Audio 2, penceramah bercakap tentang membuat lagu penutup untuk Raya dan cabaran yang mereka hadapi dalam mencipta imej untuk video itu.'}
``` |
jonathanasdf/MathGLM-dataset-5M | ---
license: afl-3.0
---
Every 10th row from https://github.com/THUDM/MathGLM (the original dataset has 50M entries).
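As a hypothetical illustration only (the file names are placeholders, not actual MathGLM files), such a 1-in-10 subsample can be produced like this:
```python
# Hypothetical sketch of the subsampling described above: keep every 10th line
# of a large line-oriented dump. Input/output paths are placeholders.
with open("mathglm_full.txt", encoding="utf-8") as src, \
        open("mathglm_subsampled.txt", "w", encoding="utf-8") as dst:
    for i, line in enumerate(src):
        if i % 10 == 0:
            dst.write(line)
``` |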
vannarathp/segmented-openkp | ---
license: mit
---
|
PartiallyTyped/answerable_tydiqa | ---
dataset_info:
features:
- name: question_text
dtype: string
- name: document_title
dtype: string
- name: language
dtype: string
- name: annotations
struct:
- name: answer_start
sequence: int64
- name: answer_text
sequence: string
- name: document_plaintext
dtype: string
- name: document_url
dtype: string
splits:
- name: train
num_bytes: 32084629.326371837
num_examples: 29868
- name: validation
num_bytes: 3778385.324427767
num_examples: 3712
download_size: 16354337
dataset_size: 35863014.6507996
---
# Dataset Card for "answerable_tydiqa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jtatman/python-code-dataset-500k | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: system
dtype: string
splits:
- name: train
num_bytes: 922266591
num_examples: 559515
download_size: 346944286
dataset_size: 922266591
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- text-generation
tags:
- instructional
- python
- code
pretty_name: github_python
size_categories:
- 100K<n<1M
---
#### Attention: This dataset is a summary and reformat pulled from GitHub code.
You should make your own assumptions based on this.
In fact, there is another dataset I formed through parsing that addresses several points:
- out of 500k Python-related items, most of them are python-ish, not Pythonic
- the majority of the items here include excessive licensing text from the original code
- the items here are sometimes not even Python, only references to it
- there are a lot of GPL summaries floating around in the code responses or instructions
As such, you are probably not getting good data to begin with, but this should be used as a starting point at best.
You have been warned.
|
CausalLM/GPT-4-Self-Instruct-Turkish | ---
license: cc-by-4.0
language:
- tr
tags:
- gpt4
---
**Sorry, it's no longer available on Hugging Face. Please reach out to those who have already downloaded it. If you have a copy, please refrain from re-uploading it to Hugging Face. The people here don't deserve it. See also: https://twitter.com/RealJosephus/status/1779913520529707387**
As per [the community's request](https://huggingface.co/datasets/CausalLM/GPT-4-Self-Instruct-German/discussions/1), here we share a Turkish dataset synthesized using the OpenAI GPT-4 model with Self-Instruct, utilizing some excess Azure credits. Please feel free to use it. All questions and answers are newly generated by GPT-4 without specialized verification; only simple filtering and strict semantic similarity control have been applied.
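For illustration only, a hypothetical sketch of embedding-based similarity control is given below; the model name and the 0.9 threshold are assumptions, not the settings actually used for this dataset:
```python
from sentence_transformers import SentenceTransformer, util

# Hypothetical sketch of "strict semantic similarity control": drop any sample
# whose embedding is too close to one already kept. Model and threshold are
# illustrative assumptions, not the settings used for this dataset.
model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

def deduplicate(samples: list[str], threshold: float = 0.9) -> list[str]:
    kept, kept_embeddings = [], []
    for text in samples:
        emb = model.encode(text, convert_to_tensor=True)
        if all(util.cos_sim(emb, prev).item() < threshold for prev in kept_embeddings):
            kept.append(text)
            kept_embeddings.append(emb)
    return kept
```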
We hope that this will be helpful for fine-tuning open-source models for non-English languages, particularly Turkish. This dataset will be updated continuously. |
NomeIncrivel/Febatista | ---
license: openrail
---
|
LM63/LM2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1651725
num_examples: 1814
download_size: 271902
dataset_size: 1651725
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
dariolopez/Llama-2-databricks-dolly-es | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: es
num_bytes: 13756271
num_examples: 15015
download_size: 8012116
dataset_size: 13756271
configs:
- config_name: default
data_files:
- split: es
path: data/es-*
language:
- es
size_categories:
- 10K<n<100K
---
# Databricks Dolly 15k (es) for Llama-2
## Dataset Summary
The Databricks Dolly 15k dataset source is curated in multiple languages by [Argilla](https://huggingface.co/datasets/argilla/databricks-dolly-15k-curated-multilingual).
* Filtered by `lang=es`.
* Formatted according to the Llama-2 pattern: "\<s> [INST] user prompt [/INST] model output \</s>" (see the sketch below)
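Below is a minimal sketch of wrapping an instruction/output pair in that pattern; the `to_llama2` helper and the sample strings are illustrative and not the script actually used to build this dataset:
```python
# Minimal sketch of the Llama-2 prompt pattern described above.
def to_llama2(prompt: str, output: str) -> str:
    return f"<s> [INST] {prompt} [/INST] {output} </s>"

print(to_llama2("¿Cuál es la capital de España?", "La capital de España es Madrid."))
```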
## Dataset Structure
The dataset has 3909 rows of tuples (instructions and outputs). |
rachittshah/alpaca-marahti | ---
language:
- en
license: mit
---
# Marathi Translation of Alpaca Dataset
## Description
This dataset is a Marathi translation of the Alpaca dataset originally found at [tatsu-lab/alpaca](https://huggingface.co/datasets/tatsu-lab/alpaca). It has been translated to aid in the instruction and fine-tuning of Large Language Models (LLMs) for better understanding and generation of Marathi text.
## Languages
The dataset is primarily in Marathi (mr-IN).
## Dataset Structure
### Data Instances
A typical data instance comprises the original text in the source language and its corresponding translation in Marathi.
```json
{
"source_text": "Original text in source language.",
"translated_text": "अनुवादित मजकूर मराठी भाषेत."
}
```
### Data Fields
- `source_text`: the text in the original language from the Alpaca dataset.
- `translated_text`: the translated text in Marathi. |
pvduy/ultra-mix-13k | ---
dataset_info:
features:
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 71009425
num_examples: 13565
- name: test
num_bytes: 448467
num_examples: 100
download_size: 34974157
dataset_size: 71457892
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
zhusdika/phone_calls | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 20598597.0
num_examples: 2
- name: test
num_bytes: 4921255.0
num_examples: 1
download_size: 22337287
dataset_size: 25519852.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Xylverize/p2m1 | ---
license: other
---
|
Cheetor1996/Saki_Viper | ---
license: cc-by-2.0
language:
- en
tags:
- art
---
Saki from the Viper game series
- Trained with Anime (final-full-pruned) model
- Works best with ALL, MIDD, and OUTALL LoRA weight blocks
- Recommended LoRA weight: 0.7+ |
joey234/mmlu-business_ethics-original-neg | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
splits:
- name: test
num_bytes: 3988.8
num_examples: 12
download_size: 6402
dataset_size: 3988.8
---
# Dataset Card for "mmlu-business_ethics-original-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/VQAv2_validation_facebook_opt_2.7b_mode_VQAv2_visclues_detection_ns_10_open_ended | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_bs_32
num_bytes: 1769
num_examples: 10
download_size: 0
dataset_size: 1769
---
# Dataset Card for "VQAv2_validation_facebook_opt_2.7b_mode_VQAv2_visclues_detection_ns_10_open_ended"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ubermenchh/riddles_extended | ---
dataset_info:
features:
- name: number
dtype: int64
- name: messages
sequence: string
splits:
- name: train
num_bytes: 2318414
num_examples: 1746
download_size: 1258319
dataset_size: 2318414
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_wenge-research__yayi-70b-llama2 | ---
pretty_name: Evaluation run of wenge-research/yayi-70b-llama2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wenge-research/yayi-70b-llama2](https://huggingface.co/wenge-research/yayi-70b-llama2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wenge-research__yayi-70b-llama2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T20:08:14.965059](https://huggingface.co/datasets/open-llm-leaderboard/details_wenge-research__yayi-70b-llama2/blob/main/results_2023-09-13T20-08-14.965059.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6427362614871128,\n\
\ \"acc_stderr\": 0.03251742836753478,\n \"acc_norm\": 0.6468766983428953,\n\
\ \"acc_norm_stderr\": 0.032494548846313066,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.4762734947955207,\n\
\ \"mc2_stderr\": 0.01439837288557781\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.014500682618212862,\n\
\ \"acc_norm\": 0.606655290102389,\n \"acc_norm_stderr\": 0.014275101465693026\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.640211113324039,\n\
\ \"acc_stderr\": 0.0047895751634186535,\n \"acc_norm\": 0.8392750448117905,\n\
\ \"acc_norm_stderr\": 0.00366526456385775\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6170212765957447,\n \"acc_stderr\": 0.03177821250236922,\n\
\ \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.04644602091222318,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.04644602091222318\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469536,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469536\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289708,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289708\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.04026141497634612,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.04026141497634612\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200154,\n \"\
acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200154\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944853,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944853\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7354260089686099,\n\
\ \"acc_stderr\": 0.02960510321703832,\n \"acc_norm\": 0.7354260089686099,\n\
\ \"acc_norm_stderr\": 0.02960510321703832\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.0309227883204458,\n \"acc_norm\"\
: 0.8677685950413223,\n \"acc_norm_stderr\": 0.0309227883204458\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.038935425188248475,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.038935425188248475\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.01374079725857982,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.01374079725857982\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n\
\ \"acc_stderr\": 0.014912413096372434,\n \"acc_norm\": 0.2737430167597765,\n\
\ \"acc_norm_stderr\": 0.014912413096372434\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.0246596851859673,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.0246596851859673\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5110821382007823,\n\
\ \"acc_stderr\": 0.012767098998525826,\n \"acc_norm\": 0.5110821382007823,\n\
\ \"acc_norm_stderr\": 0.012767098998525826\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n\
\ \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083376,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083376\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.027049257915896175,\n\
\ \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.027049257915896175\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070803,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070803\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.4762734947955207,\n\
\ \"mc2_stderr\": 0.01439837288557781\n }\n}\n```"
repo_url: https://huggingface.co/wenge-research/yayi-70b-llama2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|arc:challenge|25_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hellaswag|10_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T20-08-14.965059.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T20-08-14.965059.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T20-08-14.965059.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T20-08-14.965059.parquet'
- config_name: results
data_files:
- split: 2023_09_13T20_08_14.965059
path:
- results_2023-09-13T20-08-14.965059.parquet
- split: latest
path:
- results_2023-09-13T20-08-14.965059.parquet
---
# Dataset Card for Evaluation run of wenge-research/yayi-70b-llama2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wenge-research/yayi-70b-llama2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [wenge-research/yayi-70b-llama2](https://huggingface.co/wenge-research/yayi-70b-llama2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wenge-research__yayi-70b-llama2",
"harness_truthfulqa_mc_0",
split="train")
```
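As a minimal sketch (the config and split names below are taken from the `configs` section above, so treat this as an assumption rather than the card's official example), the aggregated `results` configuration can be loaded the same way:
```python
from datasets import load_dataset

# Aggregated run-level metrics; "latest" always points at the most recent run,
# while the timestamped split (e.g. 2023_09_13T20_08_14.965059) pins a specific one.
results = load_dataset(
    "open-llm-leaderboard/details_wenge-research__yayi-70b-llama2",
    "results",
    split="latest",
)
print(results.column_names)
```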
## Latest results
These are the [latest results from run 2023-09-13T20:08:14.965059](https://huggingface.co/datasets/open-llm-leaderboard/details_wenge-research__yayi-70b-llama2/blob/main/results_2023-09-13T20-08-14.965059.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6427362614871128,
"acc_stderr": 0.03251742836753478,
"acc_norm": 0.6468766983428953,
"acc_norm_stderr": 0.032494548846313066,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155045,
"mc2": 0.4762734947955207,
"mc2_stderr": 0.01439837288557781
},
"harness|arc:challenge|25": {
"acc": 0.5614334470989761,
"acc_stderr": 0.014500682618212862,
"acc_norm": 0.606655290102389,
"acc_norm_stderr": 0.014275101465693026
},
"harness|hellaswag|10": {
"acc": 0.640211113324039,
"acc_stderr": 0.0047895751634186535,
"acc_norm": 0.8392750448117905,
"acc_norm_stderr": 0.00366526456385775
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04644602091222318,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04644602091222318
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469536,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469536
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726854,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726854
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289708,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289708
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.04026141497634612,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.04026141497634612
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.016722684526200154,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.016722684526200154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944853,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944853
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7354260089686099,
"acc_stderr": 0.02960510321703832,
"acc_norm": 0.7354260089686099,
"acc_norm_stderr": 0.02960510321703832
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.0309227883204458,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.0309227883204458
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.038935425188248475,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.038935425188248475
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.01374079725857982,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.01374079725857982
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372434,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372434
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7363344051446945,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.7363344051446945,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.0246596851859673,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.0246596851859673
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5110821382007823,
"acc_stderr": 0.012767098998525826,
"acc_norm": 0.5110821382007823,
"acc_norm_stderr": 0.012767098998525826
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003476,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003476
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083376,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083376
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.027049257915896175,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.027049257915896175
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070803,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070803
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155045,
"mc2": 0.4762734947955207,
"mc2_stderr": 0.01439837288557781
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
maidalun1020/MMarcoRerankingEn2Zh | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
dataset_info:
features:
- name: query
dtype: string
- name: positive
sequence: string
- name: negative
sequence: string
splits:
- name: dev
num_bytes: 7257910
num_examples: 269
download_size: 4156414
dataset_size: 7257910
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_272 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 767228868.0
num_examples: 149499
download_size: 785670623
dataset_size: 767228868.0
---
# Dataset Card for "chunk_272"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_sst2_who_which | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 4995
num_examples: 31
- name: test
num_bytes: 11988
num_examples: 69
- name: train
num_bytes: 149971
num_examples: 1021
download_size: 76989
dataset_size: 166954
---
# Dataset Card for "MULTI_VALUE_sst2_who_which"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/novel_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of novel/ノベル/诺薇儿/노벨 (Nikke: Goddess of Victory)
This is the dataset of novel/ノベル/诺薇儿/노벨 (Nikke: Goddess of Victory), containing 27 images and their tags.
The core tags of this character are `bangs, hat, green_eyes, hair_bun, brown_hair, breasts, blonde_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 27 | 43.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/novel_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 27 | 21.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/novel_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 64 | 46.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/novel_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 27 | 35.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/novel_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 64 | 72.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/novel_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/novel_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
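If only one of the packaged archives listed above is needed rather than the raw one, a hedged sketch reusing the same `hf_hub_download` call could fetch it directly; the 800px IMG+TXT package is assumed here, matching the download URL in the table.
```python
from huggingface_hub import hf_hub_download

# fetch the 800px IMG+TXT package listed in the table above
zip_800 = hf_hub_download(
    repo_id='CyberHarem/novel_nikke',
    repo_type='dataset',
    filename='dataset-800.zip',
)
print(zip_800)  # local path to the downloaded archive
```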
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 27 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, open_mouth, thighhighs, smile, holding, skirt, +_+, long_sleeves, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | open_mouth | thighhighs | smile | holding | skirt | +_+ | long_sleeves | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:-------------|:-------------|:--------|:----------|:--------|:------|:---------------|:--------------------|:-------------------|
| 0 | 27 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
dmrau/cqadupstack-gis-qrels | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: query-id
dtype: string
- name: corpus-id
dtype: string
- name: score
dtype: int64
splits:
- name: test
num_bytes: 28952
num_examples: 1114
download_size: 0
dataset_size: 28952
---
# Dataset Card for "cqadupstack-gis-qrels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Santp98/model_validation_ranked_ds | ---
dataset_info:
features:
- name: rank_1
dtype: string
- name: rank_2
dtype: string
- name: rank_3
dtype: string
- name: rank_4
dtype: string
- name: rank_5
dtype: string
- name: rank_6
dtype: string
- name: rank_7
dtype: string
- name: rank_8
dtype: string
- name: rank_9
dtype: string
- name: rank_10
dtype: string
- name: rank_11
dtype: string
- name: rank_12
dtype: string
- name: rank_13
dtype: string
- name: rank_14
dtype: string
- name: rank_15
dtype: string
- name: rank_16
dtype: string
- name: rank_17
dtype: string
- name: rank_18
dtype: string
- name: rank_19
dtype: string
- name: rank_20
dtype: string
- name: rank_21
dtype: string
- name: rank_22
dtype: string
- name: rank_23
dtype: string
- name: rank_24
dtype: string
- name: rank_25
dtype: string
- name: rank_26
dtype: string
- name: rank_27
dtype: string
- name: rank_28
dtype: string
- name: rank_29
dtype: string
- name: rank_30
dtype: string
- name: rank_31
dtype: string
- name: rank_32
dtype: string
- name: rank_33
dtype: string
- name: rank_34
dtype: string
- name: rank_35
dtype: string
- name: rank_36
dtype: string
- name: rank_37
dtype: string
- name: rank_38
dtype: string
- name: rank_39
dtype: string
- name: rank_40
dtype: string
- name: rank_41
dtype: string
- name: rank_42
dtype: string
- name: rank_43
dtype: string
- name: rank_44
dtype: string
- name: rank_45
dtype: string
- name: rank_46
dtype: string
- name: rank_47
dtype: string
- name: rank_48
dtype: string
- name: rank_49
dtype: string
- name: rank_50
dtype: string
- name: rank_51
dtype: string
- name: rank_52
dtype: string
- name: rank_53
dtype: string
- name: rank_54
dtype: string
- name: rank_55
dtype: string
- name: rank_56
dtype: string
- name: rank_57
dtype: string
- name: rank_58
dtype: string
- name: rank_59
dtype: string
- name: rank_60
dtype: string
- name: rank_61
dtype: string
- name: rank_62
dtype: string
- name: rank_63
dtype: string
- name: rank_64
dtype: string
- name: rank_65
dtype: string
- name: rank_66
dtype: string
- name: rank_67
dtype: string
- name: rank_68
dtype: string
- name: rank_69
dtype: string
- name: rank_70
dtype: string
- name: rank_71
dtype: string
- name: rank_72
dtype: string
- name: rank_73
dtype: string
- name: rank_74
dtype: string
- name: rank_75
dtype: string
- name: rank_76
dtype: string
- name: rank_77
dtype: string
- name: rank_78
dtype: string
- name: rank_79
dtype: string
- name: rank_80
dtype: string
- name: rank_81
dtype: string
- name: rank_82
dtype: string
- name: rank_83
dtype: string
- name: rank_84
dtype: string
- name: rank_85
dtype: string
- name: rank_86
dtype: string
- name: rank_87
dtype: string
- name: rank_88
dtype: string
- name: rank_89
dtype: string
- name: rank_90
dtype: string
- name: rank_91
dtype: string
- name: rank_92
dtype: string
- name: rank_93
dtype: string
- name: rank_94
dtype: string
- name: rank_95
dtype: string
- name: rank_96
dtype: string
- name: rank_97
dtype: string
- name: rank_98
dtype: string
- name: rank_99
dtype: string
- name: rank_100
dtype: string
- name: generated_queries
dtype: string
splits:
- name: train
num_bytes: 820598
num_examples: 500
download_size: 308559
dataset_size: 820598
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "model_validation_ranked_ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/penance_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of penance/ペナンス/斥罪 (Arknights)
This is the dataset of penance/ペナンス/斥罪 (Arknights), containing 362 images and their tags.
The core tags of this character are `animal_ears, wolf_ears, wolf_girl, yellow_eyes, long_hair, brown_hair, extra_ears, breasts, hair_between_eyes, braid, earrings, tail, wolf_tail, large_breasts, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 362 | 680.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/penance_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 362 | 558.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/penance_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 907 | 1.05 GiB | [Download](https://huggingface.co/datasets/CyberHarem/penance_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/penance_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | 1girl, black_jacket, solo, closed_mouth, simple_background, white_ascot, looking_at_viewer, upper_body, white_background, jewelry, white_shirt |
| 1 | 31 |  |  |  |  |  | black_jacket, solo, white_ascot, 1girl, long_sleeves, looking_at_viewer, open_book, black_thighhighs, holding_book, white_shirt, black_coat, skirt, single_gauntlet, open_coat, black_nails, simple_background, jewelry, white_background, black_footwear, closed_mouth, cowboy_shot, hand_up |
| 2 | 6 |  |  |  |  |  | 1girl, black_coat, black_jacket, black_nails, holding_book, nail_polish, solo, upper_body, white_ascot, closed_mouth, long_sleeves, looking_at_viewer, white_shirt, medium_breasts, open_book, animal_ear_fluff, jewelry, single_gauntlet |
| 3 | 6 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, official_alternate_costume, open_jacket, single_braid, solo, white_jacket, fur-trimmed_jacket, jewelry, side_braid, upper_body, black_nails, blush, nail_polish, asymmetrical_bangs, black_belt, black_dress, black_shirt, black_sweater, buckle, closed_mouth, hair_ornament, hand_up, medium_breasts, on_back, parted_lips, simple_background, turtleneck, white_background |
| 4 | 16 |  |  |  |  |  | 1girl, long_sleeves, official_alternate_costume, solo, blush, jewelry, single_braid, drinking_glass, holding_cup, holding_drink, looking_at_viewer, white_jacket, alcohol, black_sweater, open_jacket, parted_lips, simple_background, turtleneck, side_braid, nail_polish, black_nails, animal_ear_fluff, black_dress, drunk, fur-trimmed_jacket, hairclip, upper_body, white_background, asymmetrical_bangs, black_belt, lemon_slice, red_wine |
| 5 | 6 |  |  |  |  |  | 1girl, cleavage, long_sleeves, solo, white_shirt, black_bra, blush, collarbone, open_shirt, animal_ear_fluff, simple_background, white_background, closed_mouth, collared_shirt, looking_at_viewer, navel, upper_body |
| 6 | 8 |  |  |  |  |  | 1boy, black_hair, looking_at_viewer, male_focus, short_hair, solo, closed_mouth, black_gloves, green_eyes, infection_monitor_(arknights), long_sleeves, wolf_boy, black_jacket, ear_piercing, suit, black_coat, black_necktie, black_pants, collared_shirt, formal, fur-trimmed_coat, holding, rain |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_jacket | solo | closed_mouth | simple_background | white_ascot | looking_at_viewer | upper_body | white_background | jewelry | white_shirt | long_sleeves | open_book | black_thighhighs | holding_book | black_coat | skirt | single_gauntlet | open_coat | black_nails | black_footwear | cowboy_shot | hand_up | nail_polish | medium_breasts | animal_ear_fluff | official_alternate_costume | open_jacket | single_braid | white_jacket | fur-trimmed_jacket | side_braid | blush | asymmetrical_bangs | black_belt | black_dress | black_shirt | black_sweater | buckle | hair_ornament | on_back | parted_lips | turtleneck | drinking_glass | holding_cup | holding_drink | alcohol | drunk | hairclip | lemon_slice | red_wine | cleavage | black_bra | collarbone | open_shirt | collared_shirt | navel | 1boy | black_hair | male_focus | short_hair | black_gloves | green_eyes | infection_monitor_(arknights) | wolf_boy | ear_piercing | suit | black_necktie | black_pants | formal | fur-trimmed_coat | holding | rain |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:---------------|:--------------------|:--------------|:--------------------|:-------------|:-------------------|:----------|:--------------|:---------------|:------------|:-------------------|:---------------|:-------------|:--------|:------------------|:------------|:--------------|:-----------------|:--------------|:----------|:--------------|:-----------------|:-------------------|:-----------------------------|:--------------|:---------------|:---------------|:---------------------|:-------------|:--------|:---------------------|:-------------|:--------------|:--------------|:----------------|:---------|:----------------|:----------|:--------------|:-------------|:-----------------|:--------------|:----------------|:----------|:--------|:-----------|:--------------|:-----------|:-----------|:------------|:-------------|:-------------|:-----------------|:--------|:-------|:-------------|:-------------|:-------------|:---------------|:-------------|:--------------------------------|:-----------|:---------------|:-------|:----------------|:--------------|:---------|:-------------------|:----------|:-------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 31 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | | X | X | X | | X | X | X | X | | X | X | | X | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | X | X | | X | X | X | X | | X | | | | | | | | X | | | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 16 |  |  |  |  |  | X | | X | | X | | X | X | X | X | | X | | | | | | | | X | | | | X | | X | X | X | X | X | X | X | X | X | X | X | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | X | X | X | | X | X | X | | X | X | | | | | | | | | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | | X | X | X | | | X | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Dragon1218/Thai_QA_Set | ---
license: apache-2.0
language:
- th
pretty_name: QASet
--- |
AtlasUnified/atlas-preprocessed-code | ---
license: bigscience-openrail-m
---
|
maghwa/OpenHermes-2-AR-10K-32-740k-750k | ---
dataset_info:
features:
- name: language
dtype: 'null'
- name: model
dtype: 'null'
- name: system_prompt
dtype: 'null'
- name: category
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: skip_prompt_formatting
dtype: 'null'
- name: topic
dtype: 'null'
- name: idx
dtype: 'null'
- name: title
dtype: 'null'
- name: id
dtype: 'null'
- name: source
dtype: string
- name: views
dtype: float64
- name: custom_instruction
dtype: 'null'
- name: hash
dtype: 'null'
- name: conversations
dtype: string
- name: model_name
dtype: 'null'
splits:
- name: train
num_bytes: 25113147
num_examples: 10001
download_size: 11370472
dataset_size: 25113147
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
paul-w-qs/contracts_v9 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: JSON_LABEL
dtype: string
splits:
- name: train
num_bytes: 84923820.0
num_examples: 514
download_size: 84024431
dataset_size: 84923820.0
---
# Dataset Card for "contracts_v9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
darcycao/autotrain-data-hanz_en2zh | ---
language:
- zh
- en
task_categories:
- translation
---
# AutoTrain Dataset for project: hanz_en2zh
## Dataset Description
This dataset has been automatically processed by AutoTrain for project hanz_en2zh.
### Languages
The BCP-47 code for the dataset's language is zh2en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"source": "sarashi",
"target": "sarashi"
},
{
"source": "Dojo",
"target": "Dojo"
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"source": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
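For reference, here is a minimal sketch (hypothetical, not produced by AutoTrain) that declares the same schema with the `datasets` library, using the sample rows shown above:
```python
from datasets import Dataset, Features, Value

# schema matching the fields listed above: two string columns, source and target
features = Features({"source": Value("string"), "target": Value("string")})

# sample rows copied from the Data Instances section
rows = [
    {"source": "sarashi", "target": "sarashi"},
    {"source": "Dojo", "target": "Dojo"},
]
ds = Dataset.from_list(rows, features=features)
print(ds.features)
```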
### Dataset Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 98 |
| valid | 25 |
|
yiwang454/pixel_font | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: num_patches
dtype: int64
splits:
- name: train
num_bytes: 62177164.0
num_examples: 6696
download_size: 58030443
dataset_size: 62177164.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "pixel_font"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xlangai/spider | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
- machine-generated
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text2text-generation
task_ids: []
paperswithcode_id: spider-1
pretty_name: Spider
tags:
- text-to-sql
dataset_info:
config_name: spider
features:
- name: db_id
dtype: string
- name: query
dtype: string
- name: question
dtype: string
- name: query_toks
sequence: string
- name: query_toks_no_value
sequence: string
- name: question_toks
sequence: string
splits:
- name: train
num_bytes: 4743786
num_examples: 7000
- name: validation
num_bytes: 682090
num_examples: 1034
download_size: 957246
dataset_size: 5425876
configs:
- config_name: spider
data_files:
- split: train
path: spider/train-*
- split: validation
path: spider/validation-*
default: true
---
# Dataset Card for Spider
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://yale-lily.github.io/spider
- **Repository:** https://github.com/taoyds/spider
- **Paper:** https://www.aclweb.org/anthology/D18-1425/
- **Paper:** https://arxiv.org/abs/1809.08887
- **Point of Contact:** [Yale LILY](https://yale-lily.github.io/)
### Dataset Summary
Spider is a large-scale, complex, cross-domain semantic parsing and text-to-SQL dataset annotated by 11 Yale students.
The goal of the Spider challenge is to develop natural language interfaces to cross-domain databases.
### Supported Tasks and Leaderboards
The leaderboard can be seen at https://yale-lily.github.io/spider
### Languages
The text in the dataset is in English.
## Dataset Structure
### Data Instances
**What do the instances that comprise the dataset represent?**
Each instance is a natural language question paired with the equivalent SQL query.
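A minimal loading sketch (using the default `spider` configuration declared in the metadata above) that prints the fields of the first training instance:
```python
from datasets import load_dataset

# the "spider" config is marked as the default in the dataset metadata
spider = load_dataset("xlangai/spider", "spider", split="train")

example = spider[0]
print(example["db_id"])     # database the query runs against
print(example["question"])  # natural language question
print(example["query"])     # equivalent SQL query
```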
**How many instances are there in total?**
The released splits contain 8,034 instances: 7,000 in the train split and 1,034 in the validation split (see Data Splits below).
**What data does each instance consist of?**
[More Information Needed]
### Data Fields
* **db_id**: Database name
* **question**: Natural language to interpret into SQL
* **query**: Target SQL query
* **query_toks**: List of tokens for the query
* **query_toks_no_value**: List of tokens for the query with literal values replaced by a placeholder
* **question_toks**: List of tokens for the question
### Data Splits
**train**: 7000 questions and SQL query pairs
**dev**: 1034 question and SQL query pairs
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
[More Information Needed]
### Annotations
The dataset was annotated by 11 college students at Yale University.
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
## Additional Information
The authors listed on the homepage maintain and support the dataset.
### Dataset Curators
[More Information Needed]
### Licensing Information
The Spider dataset is licensed under the [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/legalcode) license.
[More Information Needed]
### Citation Information
```
@inproceedings{yu-etal-2018-spider,
title = "{S}pider: A Large-Scale Human-Labeled Dataset for Complex and Cross-Domain Semantic Parsing and Text-to-{SQL} Task",
author = "Yu, Tao and
Zhang, Rui and
Yang, Kai and
Yasunaga, Michihiro and
Wang, Dongxu and
Li, Zifan and
Ma, James and
Li, Irene and
Yao, Qingning and
Roman, Shanelle and
Zhang, Zilin and
Radev, Dragomir",
editor = "Riloff, Ellen and
Chiang, David and
Hockenmaier, Julia and
Tsujii, Jun{'}ichi",
booktitle = "Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing",
month = oct # "-" # nov,
year = "2018",
address = "Brussels, Belgium",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/D18-1425",
doi = "10.18653/v1/D18-1425",
pages = "3911--3921",
archivePrefix={arXiv},
eprint={1809.08887},
primaryClass={cs.CL},
}
```
### Contributions
Thanks to [@olinguyen](https://github.com/olinguyen) for adding this dataset. |
Tristan/olm-october-2022-tokenized | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 84051313200.0
num_examples: 23347587
download_size: 21176572924
dataset_size: 84051313200.0
---
# Dataset Card for "olm-october-2022-tokenized-olm-bert-base-uncased"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sudipchel/Mental-Health-800 | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3200
num_examples: 800
download_size: 715
dataset_size: 3200
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
huggingartists/freddie-dredd | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/freddie-dredd"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.261399 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/f198be5e1dfd71285efa66c8b223ae6d.400x400x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/freddie-dredd">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Freddie Dredd</div>
<a href="https://genius.com/artists/freddie-dredd">
<div style="text-align: center; font-size: 14px;">@freddie-dredd</div>
</a>
</div>
### Dataset Summary
This dataset contains lyrics parsed from Genius and is designed for generating lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/freddie-dredd).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
You can load this dataset directly with the `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/freddie-dredd")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|212| -| -|
The 'train' split can easily be divided into 'train', 'validation', and 'test' splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/freddie-dredd")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
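# np.split cuts the list of lyrics at the cumulative 90% and 97% indices,
# yielding the train / validation / test portions in that order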
texts = datasets['train']['text']
train, validation, test = np.split(texts, [int(len(texts)*train_percentage), int(len(texts)*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk}
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
tori29umai/CounterfeitXL-V1.0_canny_noline_dataset | ---
license: openrail
---
|
sushobhan55/Jordan-Peterson-Conversation-for-NLP | ---
license: apache-2.0
---
This dataset contains dialogues from Jordan Peterson, drawn from Quora answers and interview transcripts. The dataset was manually created to imitate conversation. |
Ushanka117/gregorfromlimbert | ---
license: openrail
---
|
jlbaker361/actstu-openjourney | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: seed
dtype: int64
- name: steps
dtype: int64
splits:
- name: train
num_bytes: 13477522.0
num_examples: 28
download_size: 13479202
dataset_size: 13477522.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/lila_decyrus_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lila_decyrus/リラ・ディザイアス/莉拉·德西亚斯 (Azur Lane)
This is the dataset of lila_decyrus/リラ・ディザイアス/莉拉·德西亚斯 (Azur Lane), containing 210 images and their tags.
The core tags of this character are `long_hair, bangs, breasts, hair_ornament, hairclip, crossed_bangs, large_breasts, low_twintails, twintails, heterochromia, pale_skin, red_eyes, hair_between_eyes, blue_eyes, very_long_hair, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([HuggingFace organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 210 | 284.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lila_decyrus_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 210 | 162.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lila_decyrus_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 488 | 327.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lila_decyrus_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 210 | 251.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lila_decyrus_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 488 | 461.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lila_decyrus_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lila_decyrus_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1boy, 1girl, blush, hetero, looking_at_viewer, navel, nipples, penis, pussy, sex, solo_focus, sweat, vaginal, pov, spread_legs, thighs, completely_nude, cowgirl_position, girl_on_top, mosaic_censoring, collarbone, open_mouth |
| 1 | 11 |  |  |  |  |  | 1boy, 1girl, solo_focus, pov, huge_breasts, bare_shoulders, blush, cum_on_breasts, penis, breasts_squeezed_together, looking_at_viewer, paizuri_under_clothes, closed_mouth, fur, nipples, nude |
| 2 | 21 |  |  |  |  |  | 1girl, solo, looking_at_viewer, bare_shoulders, bodysuit, black_nails, bodystocking, fur, closed_mouth, fingernails, cleavage, sitting |
| 3 | 10 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, bodysuit, simple_background, white_background, bodystocking, cleavage, closed_mouth |
| 4 | 8 |  |  |  |  |  | 1girl, blush, bodysuit, looking_at_viewer, simple_background, solo, white_background, thighs, bare_shoulders, fur, cleavage, open_mouth, parted_lips |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | blush | hetero | looking_at_viewer | navel | nipples | penis | pussy | sex | solo_focus | sweat | vaginal | pov | spread_legs | thighs | completely_nude | cowgirl_position | girl_on_top | mosaic_censoring | collarbone | open_mouth | huge_breasts | bare_shoulders | cum_on_breasts | breasts_squeezed_together | paizuri_under_clothes | closed_mouth | fur | nude | solo | bodysuit | black_nails | bodystocking | fingernails | cleavage | sitting | simple_background | white_background | parted_lips |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:--------|:---------|:--------------------|:--------|:----------|:--------|:--------|:------|:-------------|:--------|:----------|:------|:--------------|:---------|:------------------|:-------------------|:--------------|:-------------------|:-------------|:-------------|:---------------|:-----------------|:-----------------|:----------------------------|:------------------------|:---------------|:------|:-------|:-------|:-----------|:--------------|:---------------|:--------------|:-----------|:----------|:--------------------|:-------------------|:--------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | | X | | X | X | | | X | | | X | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 2 | 21 |  |  |  |  |  | | X | | | X | | | | | | | | | | | | | | | | | | | X | | | | X | X | | X | X | X | X | X | X | X | | | |
| 3 | 10 |  |  |  |  |  | | X | | | X | | | | | | | | | | | | | | | | | | | X | | | | X | | | X | X | | X | | X | | X | X | |
| 4 | 8 |  |  |  |  |  | | X | X | | X | | | | | | | | | | | X | | | | | | X | | X | | | | | X | | X | X | | | | X | | X | X | X |
|
Ahmet29/Model | ---
license: openrail
---
|