| datasetId | card |
|---|---|
liblinear/eng-russian-paintings-t2i-last-1000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 135897897.64
num_examples: 1170
download_size: 133626115
dataset_size: 135897897.64
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
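Each row above pairs a dataset ID with its raw card text, and every card opens with a YAML front-matter block fenced by `---` lines, followed by an optional markdown body. A minimal sketch of splitting a raw card into those two parts (the function name and the sample card are illustrative, not part of any row above):

```python
def split_card(card: str) -> tuple[str, str]:
    """Split a raw card into (yaml_front_matter, markdown_body).

    Cards start with a '---' line, then YAML metadata, then a
    closing '---' line; anything after that is the markdown body.
    """
    lines = card.splitlines()
    if not lines or lines[0].strip() != "---":
        return "", card  # no front matter at all
    # find the closing '---' fence
    for i in range(1, len(lines)):
        if lines[i].strip() == "---":
            front = "\n".join(lines[1:i])
            body = "\n".join(lines[i + 1:])
            return front, body
    return "", card  # unterminated fence: treat the whole card as body

# illustrative card in the same shape as the rows above
front, body = split_card("---\nlicense: mit\n---\n# Dataset Card")
print(front)  # license: mit
print(body)   # # Dataset Card
```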
liersan/litest | ---
dataset_info:
features:
- name: audio
sequence: float32
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 113600198
num_examples: 534
  - name: train
num_bytes: 113600198
num_examples: 534
download_size: 114111784
dataset_size: 227200396
---
# Dataset Card for "litest"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
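In cards like the one above, `dataset_size` should equal the sum of the per-split `num_bytes` (here two splits of 113600198 bytes each give 227200396). A minimal sanity check with the numbers copied from the card; the helper name is ours, not part of any tooling:

```python
def check_card_sizes(split_bytes, dataset_size):
    """Return True when dataset_size equals the sum of per-split num_bytes."""
    return sum(split_bytes) == dataset_size

# per-split num_bytes and dataset_size copied from the litest card above
assert check_card_sizes([113600198, 113600198], 227200396)
```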
EleutherAI/quirky_capitals_alice | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: bob_label
dtype: bool
- name: alice_label
dtype: bool
- name: difficulty
dtype: float64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 56487.163245356795
num_examples: 512
- name: validation
num_bytes: 109924.0
num_examples: 1000
- name: test
num_bytes: 110136.0
num_examples: 1000
download_size: 98973
dataset_size: 276547.16324535676
---
# Dataset Card for "quirky_capitals_alice"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
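The fractional `num_bytes` on the train split above (56487.163…) is likely a proportional estimate from subsampling rather than an on-disk byte count, but `dataset_size` is still just the sum of the three splits. A quick check with the numbers copied from the card (the tolerance is our choice):

```python
import math

# per-split num_bytes and dataset_size copied from the quirky_capitals_alice card above
split_bytes = [56487.163245356795, 109924.0, 110136.0]
dataset_size = 276547.16324535676

# fractional sizes only sum up to floating-point precision, so compare loosely
assert math.isclose(sum(split_bytes), dataset_size, rel_tol=1e-12)
```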
wenjiewu/dataset_f | ---
license: mit
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_22 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1072940412.0
num_examples: 210711
download_size: 1094089323
dataset_size: 1072940412.0
---
# Dataset Card for "chunk_22"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/orca_max_300 | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 5127997192.312375
num_examples: 3006598
- name: test
num_bytes: 269894589.06907237
num_examples: 158242
- name: validation
num_bytes: 269894589.06907237
num_examples: 158242
download_size: 90547829
dataset_size: 5667786370.4505205
---
# Dataset Card for "orca_max_300"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
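The three splits above can be summarized as fractions of the total example count; a short sketch with the `num_examples` values copied from the card:

```python
# num_examples copied from the orca_max_300 card above
splits = {"train": 3006598, "test": 158242, "validation": 158242}
total = sum(splits.values())
fractions = {name: n / total for name, n in splits.items()}

for name, frac in fractions.items():
    print(f"{name}: {frac:.1%}")  # train holds roughly 90.5% of the examples
```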
allennghayoui/formatted_dataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 194135
num_examples: 192
download_size: 42212
dataset_size: 194135
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_allknowingroger__TaoPassthrough-15B-s | ---
pretty_name: Evaluation run of allknowingroger/TaoPassthrough-15B-s
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [allknowingroger/TaoPassthrough-15B-s](https://huggingface.co/allknowingroger/TaoPassthrough-15B-s)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_allknowingroger__TaoPassthrough-15B-s\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-11T08:38:19.718412](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__TaoPassthrough-15B-s/blob/main/results_2024-04-11T08-38-19.718412.json) (note
  \ that there might be results for other tasks in the repos if successive evals didn't
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6459216801334029,\n\
\ \"acc_stderr\": 0.0323164135334774,\n \"acc_norm\": 0.6480689895845723,\n\
\ \"acc_norm_stderr\": 0.03298589284457266,\n \"mc1\": 0.5924112607099143,\n\
\ \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.7474836944840552,\n\
\ \"mc2_stderr\": 0.014375017232568123\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6902730375426621,\n \"acc_stderr\": 0.01351205841523836,\n\
\ \"acc_norm\": 0.7175767918088737,\n \"acc_norm_stderr\": 0.013155456884097222\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7171878111929895,\n\
\ \"acc_stderr\": 0.004494454911844617,\n \"acc_norm\": 0.888568014339773,\n\
\ \"acc_norm_stderr\": 0.003140232392568799\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901409,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901409\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.028985455652334388,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.028985455652334388\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n\
\ \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n\
\ \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n\
\ \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n\
\ \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n\
\ \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n\
\ \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5517241379310345,\n \"acc_stderr\": 0.041443118108781526,\n \"\
acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.041443118108781526\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138215,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138215\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7935483870967742,\n \"acc_stderr\": 0.02302589961718872,\n \"\
acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.02302589961718872\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.03095405547036589,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.03095405547036589\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328974,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328974\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652456,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652456\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"\
acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\"\
: 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n\
\ \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n\
\ \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8143459915611815,\n \"acc_stderr\": 0.02531049537694486,\n\
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.02531049537694486\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579832,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579832\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3754189944134078,\n\
\ \"acc_stderr\": 0.01619510424846353,\n \"acc_norm\": 0.3754189944134078,\n\
\ \"acc_norm_stderr\": 0.01619510424846353\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399655,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399655\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49022164276401564,\n\
\ \"acc_stderr\": 0.01276779378772933,\n \"acc_norm\": 0.49022164276401564,\n\
\ \"acc_norm_stderr\": 0.01276779378772933\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7075163398692811,\n \"acc_stderr\": 0.018403415710109797,\n \
\ \"acc_norm\": 0.7075163398692811,\n \"acc_norm_stderr\": 0.018403415710109797\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169146,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5924112607099143,\n\
\ \"mc1_stderr\": 0.017201949234553107,\n \"mc2\": 0.7474836944840552,\n\
\ \"mc2_stderr\": 0.014375017232568123\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8634569850039463,\n \"acc_stderr\": 0.009650242900291598\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5003790750568613,\n \
\ \"acc_stderr\": 0.013772480761626175\n }\n}\n```"
repo_url: https://huggingface.co/allknowingroger/TaoPassthrough-15B-s
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|arc:challenge|25_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|gsm8k|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hellaswag|10_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T08-38-19.718412.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-11T08-38-19.718412.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- '**/details_harness|winogrande|5_2024-04-11T08-38-19.718412.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-11T08-38-19.718412.parquet'
- config_name: results
data_files:
- split: 2024_04_11T08_38_19.718412
path:
- results_2024-04-11T08-38-19.718412.parquet
- split: latest
path:
- results_2024-04-11T08-38-19.718412.parquet
---
# Dataset Card for Evaluation run of allknowingroger/TaoPassthrough-15B-s
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [allknowingroger/TaoPassthrough-15B-s](https://huggingface.co/allknowingroger/TaoPassthrough-15B-s) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_allknowingroger__TaoPassthrough-15B-s",
"harness_winogrande_5",
	split="latest")
```
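Each per-task config listed above follows the naming pattern `harness_hendrycksTest_<task>_<n_shots>`. As a minimal sketch, the config name for any MMLU subtask can be built with a small helper (the function name here is illustrative, not part of any official API):

```python
# Illustrative helper: build the details-config name for an MMLU subtask,
# matching the config_name entries in the YAML metadata above.
# The helper itself is an assumption for this sketch, not an official API.
def mmlu_config_name(task: str, n_shots: int = 5) -> str:
    """Return the config name for a hendrycksTest subtask."""
    return f"harness_hendrycksTest_{task}_{n_shots}"

print(mmlu_config_name("college_biology"))
# harness_hendrycksTest_college_biology_5
```

The returned string can be passed as the second argument to `load_dataset` in the example above.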
## Latest results
These are the [latest results from run 2024-04-11T08:38:19.718412](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__TaoPassthrough-15B-s/blob/main/results_2024-04-11T08-38-19.718412.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6459216801334029,
"acc_stderr": 0.0323164135334774,
"acc_norm": 0.6480689895845723,
"acc_norm_stderr": 0.03298589284457266,
"mc1": 0.5924112607099143,
"mc1_stderr": 0.017201949234553107,
"mc2": 0.7474836944840552,
"mc2_stderr": 0.014375017232568123
},
"harness|arc:challenge|25": {
"acc": 0.6902730375426621,
"acc_stderr": 0.01351205841523836,
"acc_norm": 0.7175767918088737,
"acc_norm_stderr": 0.013155456884097222
},
"harness|hellaswag|10": {
"acc": 0.7171878111929895,
"acc_stderr": 0.004494454911844617,
"acc_norm": 0.888568014339773,
"acc_norm_stderr": 0.003140232392568799
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901409,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901409
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.028985455652334388,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.028985455652334388
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.041443118108781526,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.041443118108781526
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138215,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138215
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.03095405547036589,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.03095405547036589
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328974,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328974
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652456,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.02531049537694486,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.02531049537694486
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699796,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579832,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579832
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3754189944134078,
"acc_stderr": 0.01619510424846353,
"acc_norm": 0.3754189944134078,
"acc_norm_stderr": 0.01619510424846353
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399655,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399655
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49022164276401564,
"acc_stderr": 0.01276779378772933,
"acc_norm": 0.49022164276401564,
"acc_norm_stderr": 0.01276779378772933
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7075163398692811,
"acc_stderr": 0.018403415710109797,
"acc_norm": 0.7075163398692811,
"acc_norm_stderr": 0.018403415710109797
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169146,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5924112607099143,
"mc1_stderr": 0.017201949234553107,
"mc2": 0.7474836944840552,
"mc2_stderr": 0.014375017232568123
},
"harness|winogrande|5": {
"acc": 0.8634569850039463,
"acc_stderr": 0.009650242900291598
},
"harness|gsm8k|5": {
"acc": 0.5003790750568613,
"acc_stderr": 0.013772480761626175
}
}
```
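The aggregate `acc` under `"all"` averages the per-task accuracies. As a rough sketch, such a mean can be recomputed from the per-task values in the JSON above (only three tasks included here for brevity; the leaderboard averages over all of them):

```python
# Recompute a mean accuracy over a small subset of the per-task results above.
# Values are copied from the results JSON; three tasks shown for brevity.
results = {
    "harness|hendrycksTest-abstract_algebra|5": 0.28,
    "harness|hendrycksTest-anatomy|5": 0.5851851851851851,
    "harness|hendrycksTest-astronomy|5": 0.6973684210526315,
}
mean_acc = sum(results.values()) / len(results)
print(f"{mean_acc:.4f}")
# 0.5209
```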
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yuwan0/lexica-stable-diffusion-v1-5 | ---
license: openrail
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 4940747199.862
num_examples: 11838
download_size: 4931661909
dataset_size: 4940747199.862
---
# Stable Diffusion Dataset
This is a set of about 80,000 Image-Prompt pairs generated by [stable-diffusion-v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5).
The prompts come from the dataset [Stable-Diffusion-Prompts](https://huggingface.co/datasets/Gustavosta/Stable-Diffusion-Prompts), which was filtered and extracted from the image finder for Stable Diffusion, "[Lexica.art](https://lexica.art/)". |
CyberHarem/miyashita_ai_loveliveschoolidolfestivalallstars | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of miyashita_ai/宮下愛/미야시타아이 (Love Live! School Idol Festival ALL STARS)
This is the dataset of miyashita_ai/宮下愛/미야시타아이 (Love Live! School Idol Festival ALL STARS), containing 500 images and their tags.
The core tags of this character are `blonde_hair, bangs, yellow_eyes, breasts, sidelocks, orange_eyes, medium_hair, braid, hair_ornament, ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 804.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miyashita_ai_loveliveschoolidolfestivalallstars/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 382.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miyashita_ai_loveliveschoolidolfestivalallstars/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1322 | 903.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miyashita_ai_loveliveschoolidolfestivalallstars/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 678.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miyashita_ai_loveliveschoolidolfestivalallstars/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1322 | 1.39 GiB | [Download](https://huggingface.co/datasets/CyberHarem/miyashita_ai_loveliveschoolidolfestivalallstars/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/miyashita_ai_loveliveschoolidolfestivalallstars',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, off-shoulder_shirt, solo, cleavage, necklace, smile, x_hair_ornament, medium_breasts, midriff, orange_shirt, blush, bracelet, collarbone, gyaru, looking_at_viewer, navel, one_eye_closed, star_(symbol), cellphone, denim, holding_phone, upper_body |
| 1 | 14 |  |  |  |  |  | 1girl, solo, blush, jacket_around_waist, looking_at_viewer, midriff, navel, short_sleeves, smile, white_shirt, cleavage, collarbone, medium_breasts, crop_top, pants, wristband, open_mouth, simple_background, white_background |
| 2 | 35 |  |  |  |  |  | 1girl, nijigasaki_academy_school_uniform, solo, looking_at_viewer, collared_shirt, brown_cardigan, plaid_skirt, short_sleeves, gyaru, smile, summer_uniform, neck_ribbon, pleated_skirt, jacket_around_waist, simple_background, sweater_around_waist, white_shirt, blue_skirt, short_ponytail, white_background, blush, medium_breasts, flower |
| 3 | 7 |  |  |  |  |  | 1girl, brown_cardigan, school_uniform, solo, upper_body, blush, collared_shirt, smile, white_background, white_shirt, looking_at_viewer, red_ribbon, simple_background, closed_mouth, large_breasts, neck_ribbon |
| 4 | 26 |  |  |  |  |  | 1girl, solo, hair_flower, cleavage, smile, looking_at_viewer, navel, midriff, medium_breasts, short_shorts, star_(symbol), bracelet, collarbone, one_eye_closed, black_shorts, boots, side_ponytail, blush, gyaru, orange_nails, black_footwear, heart_tattoo, necklace, ring, underwear |
| 5 | 8 |  |  |  |  |  | 1girl, solo, english_text, looking_at_viewer, smile, character_name, happy_birthday, blush, dress, dated, hat, jewelry, side_braid, white_gloves, medium_breasts, side_ponytail |
| 6 | 5 |  |  |  |  |  | 1girl, beanie, blue_headwear, long_sleeves, looking_at_viewer, solo, blush, shirt, long_hair, off_shoulder, white_background, collarbone, green_pants, grin, simple_background, sitting, sneakers, tank_top |
| 7 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, dress, blush, white_gloves, heart, open_mouth |
| 8 | 61 |  |  |  |  |  | 1girl, solo, tank_top, midriff, cheerleader, orange_skirt, french_braid, side_ponytail, wristband, collarbone, gyaru, miniskirt, pom_pom_(cheerleading), heart_necklace, navel, hairclip, black_belt, thighhighs, smile, hair_tie, off_shoulder, crop_top, pendant, asymmetrical_legwear |
| 9 | 7 |  |  |  |  |  | 1girl, black_gloves, fingerless_gloves, looking_at_viewer, solo, x_hair_ornament, black_choker, cropped_jacket, red_nails, crop_top, heart_necklace, midriff, navel, one_eye_closed, short_ponytail, belt, character_name, collarbone, french_braid, hairclip, miniskirt, nail_polish, ribbon, black_skirt, blush, bracelet, earrings, fishnet_top, grey_tank_top, grin, hair_bow, layered_skirt, side_braid |
| 10 | 15 |  |  |  |  |  | 1girl, solo, jacket, bracelet, looking_at_viewer, skirt, twintails, demon_horns, demon_tail, nail_polish, fake_horns, fishnet_pantyhose, necklace, smile, black_choker, black_nails, black_necktie, boots, gyaru, holding_weapon, one_eye_closed, white_shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | off-shoulder_shirt | solo | cleavage | necklace | smile | x_hair_ornament | medium_breasts | midriff | orange_shirt | blush | bracelet | collarbone | gyaru | looking_at_viewer | navel | one_eye_closed | star_(symbol) | cellphone | denim | holding_phone | upper_body | jacket_around_waist | short_sleeves | white_shirt | crop_top | pants | wristband | open_mouth | simple_background | white_background | nijigasaki_academy_school_uniform | collared_shirt | brown_cardigan | plaid_skirt | summer_uniform | neck_ribbon | pleated_skirt | sweater_around_waist | blue_skirt | short_ponytail | flower | school_uniform | red_ribbon | closed_mouth | large_breasts | hair_flower | short_shorts | black_shorts | boots | side_ponytail | orange_nails | black_footwear | heart_tattoo | ring | underwear | english_text | character_name | happy_birthday | dress | dated | hat | jewelry | side_braid | white_gloves | beanie | blue_headwear | long_sleeves | shirt | long_hair | off_shoulder | green_pants | grin | sitting | sneakers | tank_top | heart | cheerleader | orange_skirt | french_braid | miniskirt | pom_pom_(cheerleading) | heart_necklace | hairclip | black_belt | thighhighs | hair_tie | pendant | asymmetrical_legwear | black_gloves | fingerless_gloves | black_choker | cropped_jacket | red_nails | belt | nail_polish | ribbon | black_skirt | earrings | fishnet_top | grey_tank_top | hair_bow | layered_skirt | jacket | skirt | twintails | demon_horns | demon_tail | fake_horns | fishnet_pantyhose | black_nails | black_necktie | holding_weapon |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------------|:-------|:-----------|:-----------|:--------|:------------------|:-----------------|:----------|:---------------|:--------|:-----------|:-------------|:--------|:--------------------|:--------|:-----------------|:----------------|:------------|:--------|:----------------|:-------------|:----------------------|:----------------|:--------------|:-----------|:--------|:------------|:-------------|:--------------------|:-------------------|:------------------------------------|:-----------------|:-----------------|:--------------|:-----------------|:--------------|:----------------|:-----------------------|:-------------|:-----------------|:---------|:-----------------|:-------------|:---------------|:----------------|:--------------|:---------------|:---------------|:--------|:----------------|:---------------|:-----------------|:---------------|:-------|:------------|:---------------|:-----------------|:-----------------|:--------|:--------|:------|:----------|:-------------|:---------------|:---------|:----------------|:---------------|:--------|:------------|:---------------|:--------------|:-------|:----------|:-----------|:-----------|:--------|:--------------|:---------------|:---------------|:------------|:-------------------------|:-----------------|:-----------|:-------------|:-------------|:-----------|:----------|:-----------------------|:---------------|:--------------------|:---------------|:-----------------|:------------|:-------|:--------------|:---------|:--------------|:-----------|:--------------|:----------------|:-----------|:----------------|:---------|:--------|:------------|:--------------|:-------------|:-------------|:--------------------|:--------------|:----------------|:-----------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | | X | X | | X | | X | X | | X | | X | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 35 |  |  |  |  |  | X | | X | | | X | | X | | | X | | | X | X | | | | | | | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | X | | | X | | | | | X | | | | X | | | | | | | X | | | X | | | | | X | X | | X | X | | | X | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 26 |  |  |  |  |  | X | | X | X | X | X | | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | X | | | X | | X | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | X | | | | | | | | X | | X | | X | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | | X | | | X | | | | | X | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 61 |  |  |  |  |  | X | | X | | | X | | | X | | | | X | X | | X | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | | X | | | | X | | X | | X | X | X | | X | X | X | | | | | | | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | X | | | | | | | X | X | | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 10 | 15 |  |  |  |  |  | X | | X | | X | X | | | | | | X | | X | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
czyzi0/the-mc-speech-dataset | ---
language:
- pl
license: cc0-1.0
size_categories:
- 10K<n<100K
task_categories:
- text-to-speech
- automatic-speech-recognition
pretty_name: The MC Speech Dataset
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 44100
- name: transcript
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 6985316587.668
num_examples: 24018
download_size: 6174661195
dataset_size: 6985316587.668
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
This is a public-domain speech dataset consisting of 24,018 short audio clips of a single speaker reading sentences in Polish. A transcription is provided for each clip. The clips have a total length of more than 22 hours.
The texts are in the public domain. The audio was recorded in 2021-22 as part of my [master's thesis](http://dx.doi.org/10.13140/RG.2.2.26293.24800) and is also in the public domain.
If you use this dataset, please cite:
```
@masterthesis{mcspeech,
title={Analiza porównawcza korpusów nagrań mowy dla celów syntezy mowy w języku polskim},
author={Czyżnikiewicz, Mateusz},
year={2022},
month={December},
school={Warsaw University of Technology},
type={Master's thesis},
doi={10.13140/RG.2.2.26293.24800},
note={Available at \url{http://dx.doi.org/10.13140/RG.2.2.26293.24800}},
}
```
More info about the dataset can be found at https://github.com/czyzi0/the-mc-speech-dataset
Also, if you find this resource helpful, kindly consider leaving a like. |
dim/mt_bench_ru | ---
dataset_info:
features:
- name: question_id
dtype: int64
- name: category
dtype: string
- name: turns
sequence: string
- name: turns_ru
sequence: string
splits:
- name: train
num_bytes: 95817
num_examples: 80
download_size: 55916
dataset_size: 95817
---
# Dataset Card for "mt_bench_ru"
This dataset was automatically translated with facebook/wmt21-dense-24-wide-en-x and then corrected by me personally in some places.
If you would like to improve this dataset, you can use this Google spreadsheet: https://docs.google.com/spreadsheets/d/1C2znaufnvMU2PyqaDKMTrRKPvS60xtisdcRSlqQGUUs/edit?usp=sharing |
Isaak-Carter/JOSIE_Wizard_Vicuna_unfiltered_with_greetings_70k_v3 | ---
dataset_info:
features:
- name: sample
dtype: string
splits:
- name: train
num_bytes: 458440925
num_examples: 153908
download_size: 211543673
dataset_size: 458440925
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
```text
\n<|context|>\ncurrent time: {time}<|endoftext|>\n<|gökdeniz|>\n{text}<|endoftext|>\n<|josie|>\n{text}<|endoftext|>
```
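The template above interleaves special tokens with substituted fields. A minimal sketch of filling it in Python is shown below; the `build_sample` helper name and the example field values are illustrative, not part of the dataset's tooling:

```python
def build_sample(time: str, user_text: str, josie_text: str) -> str:
    """Fill the chat template with a timestamp, a user turn, and a reply.

    The token layout mirrors the template shown in the card; the helper
    itself is only an assumed convenience wrapper.
    """
    return (
        f"\n<|context|>\ncurrent time: {time}<|endoftext|>"
        f"\n<|gökdeniz|>\n{user_text}<|endoftext|>"
        f"\n<|josie|>\n{josie_text}<|endoftext|>"
    )

sample = build_sample("2024-01-01 09:00", "Good morning!", "Good morning, Gökdeniz.")
print(sample)
```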
|
autoevaluate/autoeval-staging-eval-project-a02353d8-c94a-4476-bd14-15028ee3f918-5452 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- autoevaluate/zero-shot-classification-sample
eval_info:
task: text_zero_shot_classification
model: autoevaluate/zero-shot-classification
metrics: []
dataset_name: autoevaluate/zero-shot-classification-sample
dataset_config: autoevaluate--zero-shot-classification-sample
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: autoevaluate/zero-shot-classification
* Dataset: autoevaluate/zero-shot-classification-sample
* Config: autoevaluate--zero-shot-classification-sample
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
hjhkoream/thatyear | ---
dataset_info:
features:
- name: episode
dtype: int64
- name: name
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 1712756269.6689477
num_examples: 8110
- name: valid
num_bytes: 571059550.3310523
num_examples: 2704
- name: test
num_bytes: 677624192
num_examples: 3240
download_size: 2269249208
dataset_size: 2961440012.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
StemGene/eurosat-demo | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': AnnualCrop
'1': Forest
'2': HerbaceousVegetation
'3': Highway
'4': Industrial
'5': Pasture
'6': PermanentCrop
'7': Residential
'8': River
'9': SeaLake
splits:
- name: train
num_bytes: 92168360.0
num_examples: 27000
download_size: 0
dataset_size: 92168360.0
---
# Dataset Card for "eurosat-demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bertbsb/Herbertbetto | ---
license: openrail
---
|
alexandreteles/image-generation-intent | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: image_intent
dtype: float64
splits:
- name: train
num_bytes: 6071714
num_examples: 72928
download_size: 2468592
dataset_size: 6071714
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
library_name: datadreamer
size_categories:
- 10K<n<100K
tags:
- datadreamer
- synthetic
- open-mixtral-8x7b
---
|
MBSPIE/GK_reel | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 929673642.768
num_examples: 1026
- name: validation
num_bytes: 2934980.0
num_examples: 42
download_size: 906545626
dataset_size: 932608622.768
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
hhhwmws/xuzhu | ---
license: cc-by-4.0
task_categories:
- text-generation
language:
- zh
size_categories:
- 1K<n<10K
---
Xuzhu (虚竹) data supporting ChatHaruhi2; it can be invoked as follows:
```python
from chatharuhi import ChatHaruhi
chatbot = ChatHaruhi( role_from_hf = 'hhhwmws/xuzhu', \
llm = 'openai')
response = chatbot.chat(role='僧人', text = '你好!')
print(response)
```
Uploader: 米唯实
For more details, see [ChatHaruhi](https://github.com/LC1332/Chat-Haruhi-Suzumiya)
You are welcome to join our [crowdsourced character creation project](https://github.com/LC1332/Chat-Haruhi-Suzumiya/tree/main/characters/novel_collecting)
### Citation
Please cite this repository if you use its data or code.
```
@misc{li2023chatharuhi,
title={ChatHaruhi: Reviving Anime Character in Reality via Large Language Model},
author={Cheng Li and Ziang Leng and Chenxi Yan and Junyi Shen and Hao Wang and Weishi MI and Yaying Fei and Xiaoyang Feng and Song Yan and HaoSheng Wang and Linkang Zhan and Yaokai Jia and Pingyu Wu and Haozhen Sun},
year={2023},
eprint={2308.09597},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
DylanonWic/common_voice_6_1_th_test | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: input_ids
sequence: int32
- name: input_values
sequence: float32
splits:
- name: test
num_bytes: 586434697
num_examples: 2050
download_size: 559868982
dataset_size: 586434697
---
# Dataset Card for "common_voice_6_1_th_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_cola_drop_copula_be_locative | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 351
num_examples: 6
- name: test
num_bytes: 617
num_examples: 9
- name: train
num_bytes: 4790
num_examples: 63
download_size: 8445
dataset_size: 5758
---
# Dataset Card for "MULTI_VALUE_cola_drop_copula_be_locative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dmayhem93/agieval-gaokao-physics | ---
dataset_info:
features:
- name: query
dtype: string
- name: choices
sequence: string
- name: gold
sequence: int64
splits:
- name: test
num_bytes: 136757
num_examples: 200
download_size: 70363
dataset_size: 136757
license: mit
---
# Dataset Card for "agieval-gaokao-physics"
Dataset taken from https://github.com/microsoft/AGIEval and processed as in that repo.
MIT License
Copyright (c) Microsoft Corporation.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE
@misc{zhong2023agieval,
title={AGIEval: A Human-Centric Benchmark for Evaluating Foundation Models},
author={Wanjun Zhong and Ruixiang Cui and Yiduo Guo and Yaobo Liang and Shuai Lu and Yanlin Wang and Amin Saied and Weizhu Chen and Nan Duan},
year={2023},
eprint={2304.06364},
archivePrefix={arXiv},
primaryClass={cs.CL}
} |
thangvip/data-kalapa-medical-chunked | ---
dataset_info:
features:
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 9804125
num_examples: 4399
download_size: 4338224
dataset_size: 9804125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-kalapa-medical-chunked"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hheiden/us-congress-117-bills | ---
license: mit
task_categories:
- text-classification
language:
- en
tags:
- legal
pretty_name: US 117th Congress Bills
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset US 117th Congress Bills
## Dataset Description
- **Homepage:** https://hunterheidenreich.com/posts/us-117th-congress-data-exploration/
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** Hunter Heidenreich
### Dataset Summary
The US 117th Congress Bills dataset is a collection of all of the House Resolutions, House Joint Resolutions,
Senate Resolutions, and Senate Joint Resolutions introduced during the 117th Congress (2021-2022).
The task is to classify each bill into one of thirty-three major policy areas.
There are 11,389 bills in the training split and 3,797 bills in the testing split.
### Supported Tasks and Leaderboards
- `text-classification`: The goal is to classify each bill into one of thirty-three major policy areas. The dataset contains both a text label (`policy_areas`) and a class integer (`y`).
These classes correspond to:
- 0: Agriculture and Food
- 1: Animals
- 2: Armed Forces and National Security
- 3: Arts, Culture, Religion
- 4: Civil Rights and Liberties, Minority Issues
- 5: Commerce
- 6: Congress
- 7: Crime and Law Enforcement
- 8: Economics and Public Finance
- 9: Education
- 10: Emergency Management
- 11: Energy
- 12: Environmental Protection
- 13: Families
- 14: Finance and Financial Sector
- 15: Foreign Trade and International Finance
- 16: Government Operations and Politics
- 17: Health
- 18: Housing and Community Development
- 19: Immigration
- 20: International Affairs
- 21: Labor and Employment
- 22: Law
- 23: Native Americans
- 24: Private Legislation
- 25: Public Lands and Natural Resources
- 26: Science, Technology, Communications
- 27: Social Sciences and History
- 28: Social Welfare
- 29: Sports and Recreation
- 30: Taxation
- 31: Transportation and Public Works
- 32: Water Resources Development
There is no leaderboard currently.
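When working with the integer labels, the mapping above can be captured as a plain lookup table. The snippet below is a convenience sketch (the `POLICY_AREAS` and `LABEL_TO_Y` names are illustrative, not part of the dataset); the entries themselves come directly from the list above:

```python
# Integer class -> policy area, as listed in the card.
POLICY_AREAS = {
    0: "Agriculture and Food",
    1: "Animals",
    2: "Armed Forces and National Security",
    3: "Arts, Culture, Religion",
    4: "Civil Rights and Liberties, Minority Issues",
    5: "Commerce",
    6: "Congress",
    7: "Crime and Law Enforcement",
    8: "Economics and Public Finance",
    9: "Education",
    10: "Emergency Management",
    11: "Energy",
    12: "Environmental Protection",
    13: "Families",
    14: "Finance and Financial Sector",
    15: "Foreign Trade and International Finance",
    16: "Government Operations and Politics",
    17: "Health",
    18: "Housing and Community Development",
    19: "Immigration",
    20: "International Affairs",
    21: "Labor and Employment",
    22: "Law",
    23: "Native Americans",
    24: "Private Legislation",
    25: "Public Lands and Natural Resources",
    26: "Science, Technology, Communications",
    27: "Social Sciences and History",
    28: "Social Welfare",
    29: "Sports and Recreation",
    30: "Taxation",
    31: "Transportation and Public Works",
    32: "Water Resources Development",
}

# Reverse lookup: policy area string -> integer class.
LABEL_TO_Y = {name: y for y, name in POLICY_AREAS.items()}

print(POLICY_AREAS[28])        # "Social Welfare"
print(LABEL_TO_Y["Taxation"])  # 30
```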
### Languages
English
## Dataset Structure
### Data Instances
```
index 11047
id H.R.4536
policy_areas Social Welfare
cur_summary Welfare for Needs not Weed Act\nThis bill proh...
cur_text To prohibit assistance provided under the prog...
title Welfare for Needs not Weed Act
titles_official To prohibit assistance provided under the prog...
titles_short Welfare for Needs not Weed Act
sponsor_name Rep. Rice, Tom
sponsor_party R
sponsor_state SC
Name: 0, dtype: object
```
### Data Fields
- `index`: A numeric index
- `id`: The unique bill ID as a string
- `policy_areas`: The key policy area as a string. This is the classification label.
- `cur_summary`: The latest summary of the bill as a string.
- `cur_text`: The latest text of the bill as a string.
- `title`: The core title of the bill, as labeled on [Congress.gov](congress.gov), as a string.
- `titles_official`: All official titles of the bill (or nested legislation) as a string.
- `titles_short`: All short titles of the bill (or nested legislation) as a string.
- `sponsor_name`: The name of the primary representative sponsoring the legislation as a string.
- `sponsor_party`: The party of the primary sponsor as a string.
- `sponsor_state`: The home state of the primary sponsor as a string.
### Data Splits
The dataset was split into training and testing sets using stratified sampling, due to the class imbalance in the dataset.
Using scikit-learn, a quarter of the data (per class) is reserved for testing:
```
from sklearn.model_selection import train_test_split

train_ix, test_ix = train_test_split(ixs, test_size=0.25, stratify=df['y'], random_state=1234567)
```
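scikit-learn handles the stratification internally; its effect — each class contributing roughly the same share of its items to both splits — can be sketched with a small stdlib-only version. The function and the toy labels below are illustrative, not the card's actual pipeline:

```python
import random
from collections import defaultdict

def stratified_split(labels, test_size=0.25, seed=1234567):
    """Split indices so each class sends ~test_size of its items to the test set."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for ix, y in enumerate(labels):
        by_class[y].append(ix)
    train_ix, test_ix = [], []
    for ixs in by_class.values():
        rng.shuffle(ixs)
        n_test = round(len(ixs) * test_size)
        test_ix.extend(ixs[:n_test])
        train_ix.extend(ixs[n_test:])
    return sorted(train_ix), sorted(test_ix)

labels = ["Health"] * 80 + ["Animals"] * 20  # imbalanced toy labels
train_ix, test_ix = stratified_split(labels)
print(len(train_ix), len(test_ix))  # 75 25
```

Both the majority and the minority class keep their 4:1 ratio in each split, which is the property stratification guarantees here.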
## Dataset Creation
### Curation Rationale
This dataset was created to provide a new dataset at the intersection of NLP and legislation.
Using this data for a simple major topic classification seemed like a practical first step.
### Source Data
#### Initial Data Collection and Normalization
Data was collected from [congress.gov](congress.gov) with minimal pre-processing.
Additional information about this dataset's collection is discussed [here](https://hunterheidenreich.com/posts/us-117th-congress-data-exploration/#data---how-it-was-obtained).
#### Who are the source language producers?
Either [Congressional Research Service](https://www.congress.gov/help/legislative-glossary#glossary_crs) or other congressional staffers.
### Annotations
#### Who are the annotators?
Congressional Staff
### Personal and Sensitive Information
None, this is publicly available text through [congress.gov](congress.gov).
## Additional Information
### Licensing Information
MIT License |
Multimodal-Fatima/cv-as-nlp-vqa-example | ---
dataset_info:
features:
- name: multiple_choice_answer
dtype: string
- name: answers
sequence: string
- name: id_image
dtype: int64
- name: question_id
dtype: int64
- name: question
dtype: string
- name: image
dtype: image
- name: id
dtype: int64
- name: blip_caption_Salesforce_blip_image_captioning_large_intensive
sequence: string
splits:
- name: validation
num_bytes: 161239813.0
num_examples: 1000
download_size: 155287446
dataset_size: 161239813.0
---
# Dataset Card for "cv-as-nlp-vqa-example"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
baffo32/dataset_test | ---
dataset_info:
features:
- name: text
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 11671025
num_examples: 1129
download_size: 156579
dataset_size: 11671025
---
# Dataset Card for "dataset_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_plural_interrogative | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 100475
num_examples: 449
- name: dev_mismatched
num_bytes: 119576
num_examples: 540
- name: test_matched
num_bytes: 131919
num_examples: 565
- name: test_mismatched
num_bytes: 131164
num_examples: 571
- name: train
num_bytes: 4944373
num_examples: 21350
download_size: 3164077
dataset_size: 5427507
---
# Dataset Card for "MULTI_VALUE_mnli_plural_interrogative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
minimario/apps_partial_0_300 | ---
dataset_info:
features:
- name: problem
dtype: string
- name: code
dtype: string
- name: label
dtype: int64
- name: full_sample
dtype: string
- name: where_from
dtype: string
splits:
- name: train
num_bytes: 685566901
num_examples: 533723
download_size: 0
dataset_size: 685566901
---
# Dataset Card for "apps_partial_0_300"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
quan246/doc_train | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: translation
struct:
- name: en
dtype: string
- name: vi
dtype: string
splits:
- name: train
num_bytes: 351140
num_examples: 1000
- name: dev
num_bytes: 31689
num_examples: 100
download_size: 221537
dataset_size: 382829
---
# Dataset Card for "doc_train"
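The `translation` feature above is a struct with `en` and `vi` string fields, so each row is a nested dict. A minimal sketch of working with that shape (the `to_parallel` helper and the sample sentences are hypothetical illustrations, not taken from the dataset):

```python
# Each row of doc_train carries a `translation` struct with "en" and "vi" keys.
row = {"translation": {"en": "Hello.", "vi": "Xin chào."}}

def to_parallel(rows):
    """Flatten rows of {"translation": {"en": ..., "vi": ...}} into
    parallel source/target lists (hypothetical helper for illustration)."""
    pairs = [(r["translation"]["en"], r["translation"]["vi"]) for r in rows]
    sources, targets = zip(*pairs)
    return list(sources), list(targets)

en, vi = to_parallel([row])
print(en, vi)  # ['Hello.'] ['Xin chào.']
```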
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heoji/ko_text | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: aihub_koen
num_bytes: 12601002
num_examples: 83332
- name: mw_textbook
num_bytes: 1722063576
num_examples: 395985
- name: korNLI
num_bytes: 40550933
num_examples: 193076
- name: kullm
num_bytes: 197495319
num_examples: 152630
- name: mmlu_all
num_bytes: 406126621
num_examples: 97765
download_size: 905702609
dataset_size: 2378837451
configs:
- config_name: default
data_files:
- split: aihub_koen
path: data/aihub_koen-*
- split: mw_textbook
path: data/mw_textbook-*
- split: korNLI
path: data/korNLI-*
- split: kullm
path: data/kullm-*
- split: mmlu_all
path: data/mmlu_all-*
---
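The split statistics in the `dataset_info` block above are machine-readable, so the declared `dataset_size` can be sanity-checked by totalling the per-split figures. A minimal sketch in plain Python (the numbers are copied from the metadata above):

```python
# Split statistics copied from the dataset_info block above.
splits = [
    {"name": "aihub_koen", "num_bytes": 12601002, "num_examples": 83332},
    {"name": "mw_textbook", "num_bytes": 1722063576, "num_examples": 395985},
    {"name": "korNLI", "num_bytes": 40550933, "num_examples": 193076},
    {"name": "kullm", "num_bytes": 197495319, "num_examples": 152630},
    {"name": "mmlu_all", "num_bytes": 406126621, "num_examples": 97765},
]

total_bytes = sum(s["num_bytes"] for s in splits)
total_examples = sum(s["num_examples"] for s in splits)

print(total_bytes)     # 2378837451 -- matches dataset_size above
print(total_examples)  # 922788
```

The separate `download_size` (905702609) is smaller than `dataset_size`, which is typical when the hosted files are stored in a compressed format such as Parquet.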
|
wac81/tumours_chinese | ---
license: gpl-3.0
---
|
isashap/pleasework | ---
language:
- en
pretty_name: AI Resume
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Doub7e/SDv2-Spatial-Iterative-1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: T5_last_hidden_states
sequence:
sequence:
sequence: float32
splits:
- name: train
num_bytes: 1240283507.0
num_examples: 1000
download_size: 1074022406
dataset_size: 1240283507.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/tamayo_demonslayer | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tamayo (Kimetsu no Yaiba)
This is the dataset of tamayo (Kimetsu no Yaiba), containing 36 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
|
bruno17eyef/cristianoronaldo | ---
license: openrail
---
|
open-llm-leaderboard/details_KoboldAI__GPT-J-6B-Shinen | ---
pretty_name: Evaluation run of KoboldAI/GPT-J-6B-Shinen
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KoboldAI/GPT-J-6B-Shinen](https://huggingface.co/KoboldAI/GPT-J-6B-Shinen) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__GPT-J-6B-Shinen\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-21T16:38:56.875450](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__GPT-J-6B-Shinen/blob/main/results_2023-10-21T16-38-56.875450.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.00033145814652192477,\n \"f1\": 0.047103607382550344,\n\
\ \"f1_stderr\": 0.001175475504491836,\n \"acc\": 0.330297940428669,\n\
\ \"acc_stderr\": 0.00865604909042797\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652192477,\n\
\ \"f1\": 0.047103607382550344,\n \"f1_stderr\": 0.001175475504491836\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.019711902956785442,\n \
\ \"acc_stderr\": 0.0038289829787357113\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6408839779005525,\n \"acc_stderr\": 0.013483115202120229\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KoboldAI/GPT-J-6B-Shinen
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_21T16_38_56.875450
path:
- '**/details_harness|drop|3_2023-10-21T16-38-56.875450.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-21T16-38-56.875450.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_21T16_38_56.875450
path:
- '**/details_harness|gsm8k|5_2023-10-21T16-38-56.875450.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-21T16-38-56.875450.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:56:59.519326.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:56:59.519326.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T15:56:59.519326.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_21T16_38_56.875450
path:
- '**/details_harness|winogrande|5_2023-10-21T16-38-56.875450.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-21T16-38-56.875450.parquet'
- config_name: results
data_files:
- split: 2023_07_19T15_56_59.519326
path:
- results_2023-07-19T15:56:59.519326.parquet
- split: 2023_10_21T16_38_56.875450
path:
- results_2023-10-21T16-38-56.875450.parquet
- split: latest
path:
- results_2023-10-21T16-38-56.875450.parquet
---
# Dataset Card for Evaluation run of KoboldAI/GPT-J-6B-Shinen
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KoboldAI/GPT-J-6B-Shinen
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KoboldAI/GPT-J-6B-Shinen](https://huggingface.co/KoboldAI/GPT-J-6B-Shinen) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KoboldAI__GPT-J-6B-Shinen",
"harness_winogrande_5",
split="train")
```
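Past runs are also exposed as timestamped splits (alongside the `latest` alias). The split name for a given run can be derived from its timestamp; the mapping sketched below (`-` and `:` become `_`) is inferred from this card's own `configs` list, not from a documented API, so treat it as an assumption:

```python
def run_timestamp_to_split(ts: str) -> str:
    # Hypothetical helper: split names in this dataset appear to replace
    # '-' and ':' in the run timestamp with '_' (inferred from the configs
    # list above, not from a documented convention).
    return ts.replace("-", "_").replace(":", "_")

# The run shown in the "Latest results" section:
print(run_timestamp_to_split("2023-10-21T16:38:56.875450"))
```

The resulting name can then be passed as `split=` to `load_dataset` in place of `"train"` or `"latest"`.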
## Latest results
These are the [latest results from run 2023-10-21T16:38:56.875450](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__GPT-J-6B-Shinen/blob/main/results_2023-10-21T16-38-56.875450.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652192477,
"f1": 0.047103607382550344,
"f1_stderr": 0.001175475504491836,
"acc": 0.330297940428669,
"acc_stderr": 0.00865604909042797
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652192477,
"f1": 0.047103607382550344,
"f1_stderr": 0.001175475504491836
},
"harness|gsm8k|5": {
"acc": 0.019711902956785442,
"acc_stderr": 0.0038289829787357113
},
"harness|winogrande|5": {
"acc": 0.6408839779005525,
"acc_stderr": 0.013483115202120229
}
}
```
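The task keys in the dict above follow a `harness|<task>|<num_fewshot>` pattern. A minimal sketch of pulling per-task accuracy out of such a dict (values copied from the results above; the key-parsing convention is an inference from this card, not a documented schema):

```python
# Per-task entries from the results above (the "all" aggregate omitted)
results = {
    "harness|gsm8k|5": {"acc": 0.019711902956785442},
    "harness|winogrande|5": {"acc": 0.6408839779005525},
}

# Strip the 'harness|' prefix and the few-shot count to key by task name
per_task = {
    key.split("|")[1]: metrics["acc"]
    for key, metrics in results.items()
}
print(per_task)
```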
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
hojzas/sophie2 | ---
license: apache-2.0
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_2 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 907867956.0
num_examples: 178293
download_size: 922520202
dataset_size: 907867956.0
---
# Dataset Card for "chunk_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ytzi/the-stack-dedup-python-filtered-dec_gen_async | ---
dataset_info:
config_name: main
features:
- name: hexsha
dtype: string
- name: size
dtype: int64
- name: ext
dtype: string
- name: lang
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_repo_head_hexsha
dtype: string
- name: max_stars_repo_licenses
sequence: string
- name: max_stars_count
dtype: int64
- name: max_stars_repo_stars_event_min_datetime
dtype: string
- name: max_stars_repo_stars_event_max_datetime
dtype: string
- name: max_issues_repo_path
dtype: string
- name: max_issues_repo_name
dtype: string
- name: max_issues_repo_head_hexsha
dtype: string
- name: max_issues_repo_licenses
sequence: string
- name: max_issues_count
dtype: int64
- name: max_issues_repo_issues_event_min_datetime
dtype: string
- name: max_issues_repo_issues_event_max_datetime
dtype: string
- name: max_forks_repo_path
dtype: string
- name: max_forks_repo_name
dtype: string
- name: max_forks_repo_head_hexsha
dtype: string
- name: max_forks_repo_licenses
sequence: string
- name: max_forks_count
dtype: int64
- name: max_forks_repo_forks_event_min_datetime
dtype: string
- name: max_forks_repo_forks_event_max_datetime
dtype: string
- name: content
dtype: string
- name: avg_line_length
dtype: float64
- name: max_line_length
dtype: int64
- name: alphanum_fraction
dtype: float64
- name: original_content
dtype: string
- name: filtered:remove_decorators
dtype: int64
- name: filtered:remove_async
dtype: int64
- name: filtered:remove_generators
dtype: int64
- name: filtered:remove_delete_markers
dtype: int64
splits:
- name: train
num_bytes: 128851530736
num_examples: 12960052
download_size: 49703710584
dataset_size: 128851530736
configs:
- config_name: main
data_files:
- split: train
path: main/train-*
---
This is a copy of [bigcode/the-stack-dedup](https://huggingface.co/datasets/bigcode/the-stack-dedup) with some filters applied.
The filters applied to this dataset are:
- remove_decorators
- remove_async
- remove_generators
- remove_delete_markers
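Each `filtered:*` feature is an int64 per row; a plausible reading (an assumption, not stated on this card) is that it records how much the corresponding filter changed the file relative to `original_content`. A sketch of flagging rows touched by any filter:

```python
# Hypothetical row, mirroring the feature list above. Whether the int64
# 'filtered:*' columns are counts or 0/1 flags is an assumption.
row = {
    "content": "def f():\n    return 1\n",
    "filtered:remove_decorators": 0,
    "filtered:remove_async": 1,
    "filtered:remove_generators": 0,
    "filtered:remove_delete_markers": 0,
}

FILTER_COLUMNS = [
    "filtered:remove_decorators",
    "filtered:remove_async",
    "filtered:remove_generators",
    "filtered:remove_delete_markers",
]

# A row was modified if any filter column is non-zero
was_filtered = any(row[col] > 0 for col in FILTER_COLUMNS)
print(was_filtered)
```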
|
aneeshas/imsdb-500tokenhorror-movie-scripts | ---
dataset_info:
features:
- name: Horror
dtype: string
splits:
- name: train
num_bytes: 76307
num_examples: 158
download_size: 50645
dataset_size: 76307
---
# Dataset Card for "imsdb-500tokenhorror-movie-scripts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-nutrition-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 9632
num_examples: 5
- name: test
num_bytes: 3616646
num_examples: 306
download_size: 250211
dataset_size: 3626278
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-nutrition-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
minh21/COVID-QA-testset-biencoder-data-65_25_10 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: context_chunks
sequence: string
- name: document_id
dtype: int64
- name: id
dtype: int64
- name: context
dtype: string
splits:
- name: train
num_bytes: 16708455
num_examples: 201
download_size: 442083
dataset_size: 16708455
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "COVID-QA-testset-biencoder-data-65_25_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rakshit122/zaa11 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: string
splits:
- name: train
num_bytes: 46270
num_examples: 226
download_size: 16707
dataset_size: 46270
---
# Dataset Card for "zaa11"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eDsny/lukas | ---
license: openrail
---
|
open-llm-leaderboard/details_sartmis1__starcoder-finetune-selfinstruct | ---
pretty_name: Evaluation run of sartmis1/starcoder-finetune-selfinstruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sartmis1/starcoder-finetune-selfinstruct](https://huggingface.co/sartmis1/starcoder-finetune-selfinstruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sartmis1__starcoder-finetune-selfinstruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T09:06:26.158683](https://huggingface.co/datasets/open-llm-leaderboard/details_sartmis1__starcoder-finetune-selfinstruct/blob/main/results_2023-09-23T09-06-26.158683.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.00036305608931189545,\n \"f1\": 0.04220742449664442,\n\
\ \"f1_stderr\": 0.0011048606881245398,\n \"acc\": 0.31919735419373096,\n\
\ \"acc_stderr\": 0.01022815770603217\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931189545,\n\
\ \"f1\": 0.04220742449664442,\n \"f1_stderr\": 0.0011048606881245398\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.060652009097801364,\n \
\ \"acc_stderr\": 0.0065747333814057925\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5777426992896606,\n \"acc_stderr\": 0.013881582030658549\n\
\ }\n}\n```"
repo_url: https://huggingface.co/sartmis1/starcoder-finetune-selfinstruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T09_06_26.158683
path:
- '**/details_harness|drop|3_2023-09-23T09-06-26.158683.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T09-06-26.158683.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T09_06_26.158683
path:
- '**/details_harness|gsm8k|5_2023-09-23T09-06-26.158683.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T09-06-26.158683.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T09_06_26.158683
path:
- '**/details_harness|winogrande|5_2023-09-23T09-06-26.158683.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T09-06-26.158683.parquet'
- config_name: results
data_files:
- split: 2023_09_23T09_06_26.158683
path:
- results_2023-09-23T09-06-26.158683.parquet
- split: latest
path:
- results_2023-09-23T09-06-26.158683.parquet
---
# Dataset Card for Evaluation run of sartmis1/starcoder-finetune-selfinstruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/sartmis1/starcoder-finetune-selfinstruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [sartmis1/starcoder-finetune-selfinstruct](https://huggingface.co/sartmis1/starcoder-finetune-selfinstruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sartmis1__starcoder-finetune-selfinstruct",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-23T09:06:26.158683](https://huggingface.co/datasets/open-llm-leaderboard/details_sartmis1__starcoder-finetune-selfinstruct/blob/main/results_2023-09-23T09-06-26.158683.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931189545,
"f1": 0.04220742449664442,
"f1_stderr": 0.0011048606881245398,
"acc": 0.31919735419373096,
"acc_stderr": 0.01022815770603217
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931189545,
"f1": 0.04220742449664442,
"f1_stderr": 0.0011048606881245398
},
"harness|gsm8k|5": {
"acc": 0.060652009097801364,
"acc_stderr": 0.0065747333814057925
},
"harness|winogrande|5": {
"acc": 0.5777426992896606,
"acc_stderr": 0.013881582030658549
}
}
```
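The `*_stderr` fields above are standard errors of the corresponding metrics; a rough normal-approximation 95% interval for the winogrande accuracy can be sketched as:

```python
# Values copied from the harness|winogrande|5 entry above
acc = 0.5777426992896606
stderr = 0.013881582030658549

# Normal-approximation 95% confidence interval: acc +/- 1.96 * stderr
low, high = acc - 1.96 * stderr, acc + 1.96 * stderr
print(f"winogrande acc {acc:.4f}, 95% CI [{low:.4f}, {high:.4f}]")
```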
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
juniorsamp/mcdaniel | ---
license: openrail
---
|
temasarkisov/SolidLogosID_converted_processed_V1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1029877.0
num_examples: 48
download_size: 1030924
dataset_size: 1029877.0
---
# Dataset Card for "SolidLogosID_converted_processed_V1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bdsaglam/musique-jerx-sft | ---
dataset_info:
features:
- name: text
dtype: string
- name: triplets
sequence: string
splits:
- name: train
num_bytes: 85382
num_examples: 110
download_size: 39095
dataset_size: 85382
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_WizardLM__WizardCoder-15B-V1.0 | ---
pretty_name: Evaluation run of WizardLM/WizardCoder-15B-V1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [WizardLM/WizardCoder-15B-V1.0](https://huggingface.co/WizardLM/WizardCoder-15B-V1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WizardLM__WizardCoder-15B-V1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T05:39:03.080415](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardCoder-15B-V1.0/blob/main/results_2023-10-18T05-39-03.080415.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.10171979865771812,\n\
\ \"em_stderr\": 0.003095624755865799,\n \"f1\": 0.1654624580536908,\n\
\ \"f1_stderr\": 0.0033020160713134682,\n \"acc\": 0.2864625625234491,\n\
\ \"acc_stderr\": 0.008973810218487487\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.10171979865771812,\n \"em_stderr\": 0.003095624755865799,\n\
\ \"f1\": 0.1654624580536908,\n \"f1_stderr\": 0.0033020160713134682\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02122820318423048,\n \
\ \"acc_stderr\": 0.003970449129848635\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5516969218626677,\n \"acc_stderr\": 0.013977171307126338\n\
\ }\n}\n```"
repo_url: https://huggingface.co/WizardLM/WizardCoder-15B-V1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|arc:challenge|25_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T05_39_03.080415
path:
- '**/details_harness|drop|3_2023-10-18T05-39-03.080415.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T05-39-03.080415.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T05_39_03.080415
path:
- '**/details_harness|gsm8k|5_2023-10-18T05-39-03.080415.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T05-39-03.080415.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hellaswag|10_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:24:20.327625.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T20:24:20.327625.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T20:24:20.327625.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T05_39_03.080415
path:
- '**/details_harness|winogrande|5_2023-10-18T05-39-03.080415.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T05-39-03.080415.parquet'
- config_name: results
data_files:
- split: 2023_07_19T20_24_20.327625
path:
- results_2023-07-19T20:24:20.327625.parquet
- split: 2023_10_18T05_39_03.080415
path:
- results_2023-10-18T05-39-03.080415.parquet
- split: latest
path:
- results_2023-10-18T05-39-03.080415.parquet
---
# Dataset Card for Evaluation run of WizardLM/WizardCoder-15B-V1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WizardLM/WizardCoder-15B-V1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [WizardLM/WizardCoder-15B-V1.0](https://huggingface.co/WizardLM/WizardCoder-15B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WizardLM__WizardCoder-15B-V1.0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T05:39:03.080415](https://huggingface.co/datasets/open-llm-leaderboard/details_WizardLM__WizardCoder-15B-V1.0/blob/main/results_2023-10-18T05-39-03.080415.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.10171979865771812,
"em_stderr": 0.003095624755865799,
"f1": 0.1654624580536908,
"f1_stderr": 0.0033020160713134682,
"acc": 0.2864625625234491,
"acc_stderr": 0.008973810218487487
},
"harness|drop|3": {
"em": 0.10171979865771812,
"em_stderr": 0.003095624755865799,
"f1": 0.1654624580536908,
"f1_stderr": 0.0033020160713134682
},
"harness|gsm8k|5": {
"acc": 0.02122820318423048,
"acc_stderr": 0.003970449129848635
},
"harness|winogrande|5": {
"acc": 0.5516969218626677,
"acc_stderr": 0.013977171307126338
}
}
```
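Each per-task entry in the JSON above is keyed as `harness|<task>|<n_shots>`, so individual metrics can be read out directly once the file is loaded. A small sketch (the dict literal below is an abbreviated copy of the results above, keeping only one metric per task):

```python
# Abbreviated copy of the results JSON above: "all" aggregates across tasks,
# per-task metrics are keyed as "harness|<task>|<n_shots>".
results = {
    "all": {"em": 0.10171979865771812, "f1": 0.1654624580536908,
            "acc": 0.2864625625234491},
    "harness|drop|3": {"em": 0.10171979865771812, "f1": 0.1654624580536908},
    "harness|gsm8k|5": {"acc": 0.02122820318423048},
    "harness|winogrande|5": {"acc": 0.5516969218626677},
}

# Pull a single metric out of the nested structure.
winogrande_acc = results["harness|winogrande|5"]["acc"]
```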
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
breno30/Karlos | ---
license: openrail
---
|
suseongkim87/testCodeDataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7786
num_examples: 32
download_size: 4172
dataset_size: 7786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yashraizad/yelp-open-dataset-top-reviews | ---
license: apache-2.0
---
|
DucHaiten/live-action | ---
license: openrail
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/077f7b8d | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1331
dataset_size: 178
---
# Dataset Card for "077f7b8d"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wenzhuoliu/dataset-long-context-for-e5-finetune | ---
dataset_info:
- config_name: Testset
features:
- name: query
dtype: string
- name: passage
dtype: string
splits:
- name: philosophie
num_bytes: 607946
num_examples: 216
download_size: 352564
dataset_size: 607946
- config_name: testset
features:
- name: query
dtype: string
- name: passage
dtype: string
splits:
- name: llm_wikitexts
num_bytes: 355087
num_examples: 99
- name: llm_wiki_single_long_document
num_bytes: 367297
num_examples: 148
download_size: 467511
dataset_size: 722384
- config_name: trainset
features:
- name: query
dtype: string
- name: passage
dtype: string
splits:
- name: wikihow_summary_passage
num_bytes: 332619989
num_examples: 111637
- name: llm_generated_question_passage
num_bytes: 74929318
num_examples: 20000
- name: qestion_passage_fr
num_bytes: 18560943
num_examples: 20535
download_size: 243783107
dataset_size: 426110250
configs:
- config_name: Testset
data_files:
- split: philosophie
path: Testset/philosophie-*
- config_name: testset
data_files:
- split: llm_wikitexts
path: llm_eval/train-*
- split: single_document
path: llm_eval/single_document-*
- config_name: trainset
data_files:
- split: wikihow_summary_passage
path: data/wikihow_summary_passage-*
- split: llm_generated_question_passage
path: data/llm_generated_question_passage-*
- split: qestion_passage_fr
path: data/qestion_passage_fr-*
---
# Dataset Card for "dataset-long-context-for-e5-finetune"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lodrick-the-lafted/Hermes-100K | ---
language:
- en
pretty_name: Hermes-100K
tags:
- distillation
- synthetic data
- gpt
task_categories:
- text-generation
---
It's 100K rows sampled from teknium/openhermes (not the newer 2.5).
Filtered out some GPT-isms I dislike, and also removed rows with short outputs, to bias towards longer answers.
bad_phrases = ["couldn't help but", "can't resist", "random", "unethical", "I'm sorry, but", "I'm sorry but", "as an AI", "as a Language Model", "AI Language Model", "language model", "However, it is important to", "However, it's important", "ethical guidelines", "just an AI", "within my programming", "illegal", "cannot provide"] |
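A minimal sketch of the filtering described above. The original script isn't included in this card, so the field name (`"output"`) and the length cutoff below are assumptions, not taken from the actual pipeline:

```python
# Hypothetical reimplementation of the filtering described above.
# Assumptions: each row stores its response under an "output" key, and
# "short output" means fewer than 200 characters (the real cutoff is unknown).
bad_phrases = [
    "couldn't help but", "can't resist", "random", "unethical",
    "I'm sorry, but", "I'm sorry but", "as an AI", "as a Language Model",
    "AI Language Model", "language model", "However, it is important to",
    "However, it's important", "ethical guidelines", "just an AI",
    "within my programming", "illegal", "cannot provide",
]

def keep(row, min_output_chars=200):
    """Return True if the row survives both the phrase and length filters."""
    output = row.get("output", "")
    if len(output) < min_output_chars:
        return False  # bias towards longer answers
    lowered = output.lower()
    return not any(phrase.lower() in lowered for phrase in bad_phrases)

rows = [
    {"output": "As an AI language model, I cannot provide that."},
    {"output": "Photosynthesis converts light energy into chemical energy. " * 10},
]
filtered = [r for r in rows if keep(r)]  # only the second row survives
```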
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/720c5d3f | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1341
dataset_size: 182
---
# Dataset Card for "720c5d3f"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
clue2solve/langchain-docs-guides | ---
license: apache-2.0
language:
- en
tags:
- langchain
- langchain-docs
- langchain-docs-guides
pretty_name: 'Langchain Docs - Guides '
size_categories:
- 1K<n<10K
--- |
open-llm-leaderboard/details_ehartford__Samantha-1.11-CodeLlama-34b | ---
pretty_name: Evaluation run of ehartford/Samantha-1.11-CodeLlama-34b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/Samantha-1.11-CodeLlama-34b](https://huggingface.co/ehartford/Samantha-1.11-CodeLlama-34b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__Samantha-1.11-CodeLlama-34b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T11:44:05.953431](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-CodeLlama-34b/blob/main/results_2023-10-22T11-44-05.953431.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03460570469798658,\n\
\ \"em_stderr\": 0.0018718276753995871,\n \"f1\": 0.0896822567114094,\n\
\ \"f1_stderr\": 0.002178835707458538,\n \"acc\": 0.464067454416748,\n\
\ \"acc_stderr\": 0.011642141344686711\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.03460570469798658,\n \"em_stderr\": 0.0018718276753995871,\n\
\ \"f1\": 0.0896822567114094,\n \"f1_stderr\": 0.002178835707458538\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19332827899924185,\n \
\ \"acc_stderr\": 0.010877733223180565\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.01240654946619286\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ehartford/Samantha-1.11-CodeLlama-34b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|arc:challenge|25_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T11_44_05.953431
path:
- '**/details_harness|drop|3_2023-10-22T11-44-05.953431.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T11-44-05.953431.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T11_44_05.953431
path:
- '**/details_harness|gsm8k|5_2023-10-22T11-44-05.953431.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T11-44-05.953431.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hellaswag|10_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:57:56.123943.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T02:57:56.123943.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T02:57:56.123943.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T11_44_05.953431
path:
- '**/details_harness|winogrande|5_2023-10-22T11-44-05.953431.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T11-44-05.953431.parquet'
- config_name: results
data_files:
- split: 2023_08_26T02_57_56.123943
path:
- results_2023-08-26T02:57:56.123943.parquet
- split: 2023_10_22T11_44_05.953431
path:
- results_2023-10-22T11-44-05.953431.parquet
- split: latest
path:
- results_2023-10-22T11-44-05.953431.parquet
---
# Dataset Card for Evaluation run of ehartford/Samantha-1.11-CodeLlama-34b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/Samantha-1.11-CodeLlama-34b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/Samantha-1.11-CodeLlama-34b](https://huggingface.co/ehartford/Samantha-1.11-CodeLlama-34b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__Samantha-1.11-CodeLlama-34b",
"harness_winogrande_5",
	split="latest")
```
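As an aside on the naming scheme: each `config_name` in the metadata above is derived from the harness task key by replacing `|`, `:` and `-` with underscores (e.g. `harness|truthfulqa:mc|0` becomes `harness_truthfulqa_mc_0`). A minimal sketch of that mapping (the helper name is ours, not part of the dataset):

```python
import re

def task_to_config_name(task: str) -> str:
    """Turn a harness task key (as used in the results JSON, e.g.
    'harness|truthfulqa:mc|0') into the matching dataset config name
    (e.g. 'harness_truthfulqa_mc_0') by replacing '|', ':' and '-'
    with underscores."""
    return re.sub(r"[|:\-]", "_", task)

print(task_to_config_name("harness|hendrycksTest-abstract_algebra|5"))
# → harness_hendrycksTest_abstract_algebra_5
```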
## Latest results
These are the [latest results from run 2023-10-22T11:44:05.953431](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-CodeLlama-34b/blob/main/results_2023-10-22T11-44-05.953431.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.03460570469798658,
"em_stderr": 0.0018718276753995871,
"f1": 0.0896822567114094,
"f1_stderr": 0.002178835707458538,
"acc": 0.464067454416748,
"acc_stderr": 0.011642141344686711
},
"harness|drop|3": {
"em": 0.03460570469798658,
"em_stderr": 0.0018718276753995871,
"f1": 0.0896822567114094,
"f1_stderr": 0.002178835707458538
},
"harness|gsm8k|5": {
"acc": 0.19332827899924185,
"acc_stderr": 0.010877733223180565
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.01240654946619286
}
}
```
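As a quick sanity check on these numbers, the aggregated `acc` in the `all` section appears to be the unweighted mean of the per-task accuracies (this is an assumption about the leaderboard's aggregation, but it matches the reported value for this run):

```python
# Per-task accuracies reported for this run
gsm8k_acc = 0.19332827899924185
winogrande_acc = 0.7348066298342542

# Unweighted mean reproduces the aggregated "acc" in the "all" section
mean_acc = (gsm8k_acc + winogrande_acc) / 2
assert abs(mean_acc - 0.464067454416748) < 1e-12
```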
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Shikshya/revised_tyaani_jwellery_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 108904.0
num_examples: 9
download_size: 104747
dataset_size: 108904.0
---
# Dataset Card for "revised_tyaani_jwellery_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/scylla_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of scylla/シラ/斯库拉 (Azur Lane)
This is the dataset of scylla/シラ/斯库拉 (Azur Lane), containing 89 images and their tags.
The core tags of this character are `long_hair, breasts, red_eyes, bangs, large_breasts, white_hair, very_long_hair, hairband, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 89 | 150.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scylla_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 89 | 70.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scylla_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 224 | 155.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scylla_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 89 | 125.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scylla_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 224 | 239.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scylla_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/scylla_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
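For the IMG+TXT packages (800, 1200, stage3), the layout is flat image files with a same-stem `.txt` tag file next to each image. A minimal sketch for pairing them after extraction, without waifuc (the extension list and flat layout are assumptions about the package format):

```python
import os

def load_img_txt_pairs(dataset_dir):
    """Pair each image in a flat IMG+TXT package with its same-stem .txt tag file."""
    image_exts = {'.png', '.jpg', '.jpeg', '.webp'}
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() in image_exts:
            txt_path = os.path.join(dataset_dir, stem + '.txt')
            # Skip images that have no accompanying tag file
            if os.path.exists(txt_path):
                with open(txt_path, encoding='utf-8') as f:
                    pairs.append((name, f.read().strip()))
    return pairs
```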
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, solo, looking_at_viewer, white_shirt, black_headwear, long_sleeves, smile, beret, blush, collared_shirt, black_jacket, black_skirt, pleated_skirt, black_pantyhose, closed_mouth, off_shoulder, cleavage, holding |
| 1 | 37 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, white_dress, solo, elbow_gloves, white_gloves, blush, smile, frills, simple_background, white_background, closed_mouth, hair_flower |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | white_shirt | black_headwear | long_sleeves | smile | beret | blush | collared_shirt | black_jacket | black_skirt | pleated_skirt | black_pantyhose | closed_mouth | off_shoulder | cleavage | holding | white_dress | elbow_gloves | white_gloves | frills | simple_background | white_background | hair_flower |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------------|:-----------------|:---------------|:--------|:--------|:--------|:-----------------|:---------------|:--------------|:----------------|:------------------|:---------------|:---------------|:-----------|:----------|:--------------|:---------------|:---------------|:---------|:--------------------|:-------------------|:--------------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 37 |  |  |  |  |  | X | X | X | | | | X | | X | | | | | | X | | X | | X | X | X | X | X | X | X |
|
meerlubna/NewGPT2Dataset | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 17924
num_examples: 79
download_size: 9894
dataset_size: 17924
---
# Dataset Card for "NewGPT2Dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arianhosseini/summarize_dpo1b1_ngen10_20k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 35964940
num_examples: 20000
download_size: 20633481
dataset_size: 35964940
---
# Dataset Card for "summarize_dpo1b1_ngen10_20k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/crow_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of crow/クロウ/克拉乌/크로우 (Nikke: Goddess of Victory)
This is the dataset of crow/クロウ/克拉乌/크로우 (Nikke: Goddess of Victory), containing 27 images and their tags.
The core tags of this character are `black_hair, breasts, short_hair, green_eyes, multicolored_hair, red_hair, large_breasts, streaked_hair, ear_piercing`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 27 | 43.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/crow_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 27 | 21.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/crow_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 66 | 46.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/crow_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 27 | 36.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/crow_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 66 | 69.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/crow_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/crow_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------|
| 0 | 27 |  |  |  |  |  | 1girl, looking_at_viewer, piercing, tattoo, jewelry, solo, cleavage, black_choker, bare_shoulders, jacket, navel |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | piercing | tattoo | jewelry | solo | cleavage | black_choker | bare_shoulders | jacket | navel |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-----------|:---------|:----------|:-------|:-----------|:---------------|:-----------------|:---------|:--------|
| 0 | 27 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
|
fnlp/moss-003-sft-data | ---
license: cc-by-4.0
---
# moss-003-sft-data
## Conversation Without Plugins
### Categories
| Category | \# samples |
|----------------------|-----------:|
| Brainstorming | 99,162 |
| Complex Instruction | 95,574 |
| Code | 198,079 |
| Role Playing | 246,375 |
| Writing | 341,087 |
| Harmless | 74,573 |
| Others | 19,701 |
| Total | 1,074,551 |
**Others** contains two categories: **Continue** (9,839) and **Switching** (9,862).
The **Continue** category refers to instances in a conversation where the user asks the system to continue outputting the response from the previous round that was not completed.
The **Switching** category refers to instances in a conversation where the user switches the language they are using.
We remove the data for honesty because it contains private information.
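As a quick consistency check on the table above (counts copied from the table), the categories sum to the reported total, and the two **Others** subcategories sum to the **Others** count:

```python
# Per-category sample counts from the table
counts = {
    "Brainstorming": 99_162,
    "Complex Instruction": 95_574,
    "Code": 198_079,
    "Role Playing": 246_375,
    "Writing": 341_087,
    "Harmless": 74_573,
    "Others": 19_701,
}

# Categories sum to the reported total of 1,074,551 samples
assert sum(counts.values()) == 1_074_551
# Continue (9,839) + Switching (9,862) make up Others
assert 9_839 + 9_862 == counts["Others"]
```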
|
Megalar/qa1 | ---
license: apache-2.0
---
|
Organika/StackStar_subreddits | ---
dataset_info:
features:
- name: post_ID
dtype: string
- name: subreddit
dtype: string
- name: author
dtype: string
- name: datestamp
dtype: float64
- name: comment_ID
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 355348.3325415677
num_examples: 336
- name: test
num_bytes: 89894.6674584323
num_examples: 85
download_size: 270497
dataset_size: 445243.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
fredaa/test5 | ---
license: mit
---
|
CyberHarem/yanase_miyuki_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yanase_miyuki/柳瀬美由紀 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of yanase_miyuki/柳瀬美由紀 (THE iDOLM@STER: Cinderella Girls), containing 51 images and their tags.
The core tags of this character are `brown_hair, short_hair, hair_ornament, yellow_eyes, side_ponytail, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 51 | 53.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yanase_miyuki_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 51 | 33.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yanase_miyuki_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 112 | 65.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yanase_miyuki_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 51 | 47.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yanase_miyuki_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 112 | 89.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yanase_miyuki_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yanase_miyuki_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------|
| 0 | 51 |  |  |  |  |  | 1girl, open_mouth, solo, blush, looking_at_viewer, :d |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | open_mouth | solo | blush | looking_at_viewer | :d |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:-------|:--------|:--------------------|:-----|
| 0 | 51 |  |  |  |  |  | X | X | X | X | X | X |
|
tollefj/rettsavgjoerelser_100samples_embeddings | ---
dataset_info:
features:
- name: url
dtype: string
- name: keywords
sequence: string
- name: text
dtype: string
- name: sentences
sequence: string
- name: summary
sequence: string
- name: embedding
sequence:
sequence: float32
splits:
- name: train
num_bytes: 73887305
num_examples: 100
download_size: 71145367
dataset_size: 73887305
language:
- 'no'
---
# Dataset Card for "rettsavgjoerelser_100samples_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-kmfoda__booksum-kmfoda__booksum-9d5680-2758781772 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- kmfoda/booksum
eval_info:
task: summarization
model: pszemraj/tglobal-large-booksum-WIP4-r1
metrics: []
dataset_name: kmfoda/booksum
dataset_config: kmfoda--booksum
dataset_split: test
col_mapping:
text: chapter
target: summary_text
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/tglobal-large-booksum-WIP4-r1
* Dataset: kmfoda/booksum
* Config: kmfoda--booksum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
autoevaluate/autoeval-staging-eval-project-Blaise-g__SumPubmed-f53a4404-12415653 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- Blaise-g/SumPubmed
eval_info:
task: summarization
model: Blaise-g/led-large-sumpubmed
metrics: []
dataset_name: Blaise-g/SumPubmed
dataset_config: Blaise-g--SumPubmed
dataset_split: test
col_mapping:
text: text
target: abstract
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: Blaise-g/led-large-sumpubmed
* Dataset: Blaise-g/SumPubmed
* Config: Blaise-g--SumPubmed
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Blaise-g](https://huggingface.co/Blaise-g) for evaluating this model. |
VishwaSharma84/stack_overflow_data | ---
license: openrail
---
|
jtatman/medical-sci-instruct-1m-sharegpt | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 1081196346
num_examples: 1255224
download_size: 556896933
dataset_size: 1081196346
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "medical-sci-instruct-1m-sharegpt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_hellaswag_en_conf_mgpt_worstscore | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 149738.0
num_examples: 250
download_size: 0
dataset_size: 149738.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_en_conf_mgpt_worstscore"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ohhhchank3/VONHUYdemo | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 35331137
num_examples: 40000
download_size: 20434360
dataset_size: 35331137
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Faradaylab__ARIA-70B-V2 | ---
pretty_name: Evaluation run of Faradaylab/ARIA-70B-V2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Faradaylab/ARIA-70B-V2](https://huggingface.co/Faradaylab/ARIA-70B-V2) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Faradaylab__ARIA-70B-V2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T19:48:19.078343](https://huggingface.co/datasets/open-llm-leaderboard/details_Faradaylab__ARIA-70B-V2/blob/main/results_2023-10-25T19-48-19.078343.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.12174916107382551,\n\
\ \"em_stderr\": 0.0033487438315364985,\n \"f1\": 0.18035549496644265,\n\
\ \"f1_stderr\": 0.0034191831504093964,\n \"acc\": 0.552493667621485,\n\
\ \"acc_stderr\": 0.011672124185183144\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.12174916107382551,\n \"em_stderr\": 0.0033487438315364985,\n\
\ \"f1\": 0.18035549496644265,\n \"f1_stderr\": 0.0034191831504093964\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2880970432145565,\n \
\ \"acc_stderr\": 0.012474469737197917\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8168902920284136,\n \"acc_stderr\": 0.01086977863316837\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Faradaylab/ARIA-70B-V2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|arc:challenge|25_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T19_48_19.078343
path:
- '**/details_harness|drop|3_2023-10-25T19-48-19.078343.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T19-48-19.078343.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T19_48_19.078343
path:
- '**/details_harness|gsm8k|5_2023-10-25T19-48-19.078343.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T19-48-19.078343.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hellaswag|10_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T05-14-04.383698.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T05-14-04.383698.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-14T05-14-04.383698.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T19_48_19.078343
path:
- '**/details_harness|winogrande|5_2023-10-25T19-48-19.078343.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T19-48-19.078343.parquet'
- config_name: results
data_files:
- split: 2023_09_14T05_14_04.383698
path:
- results_2023-09-14T05-14-04.383698.parquet
- split: 2023_10_25T19_48_19.078343
path:
- results_2023-10-25T19-48-19.078343.parquet
- split: latest
path:
- results_2023-10-25T19-48-19.078343.parquet
---
# Dataset Card for Evaluation run of Faradaylab/ARIA-70B-V2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Faradaylab/ARIA-70B-V2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Faradaylab/ARIA-70B-V2](https://huggingface.co/Faradaylab/ARIA-70B-V2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Faradaylab__ARIA-70B-V2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-25T19:48:19.078343](https://huggingface.co/datasets/open-llm-leaderboard/details_Faradaylab__ARIA-70B-V2/blob/main/results_2023-10-25T19-48-19.078343.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.12174916107382551,
"em_stderr": 0.0033487438315364985,
"f1": 0.18035549496644265,
"f1_stderr": 0.0034191831504093964,
"acc": 0.552493667621485,
"acc_stderr": 0.011672124185183144
},
"harness|drop|3": {
"em": 0.12174916107382551,
"em_stderr": 0.0033487438315364985,
"f1": 0.18035549496644265,
"f1_stderr": 0.0034191831504093964
},
"harness|gsm8k|5": {
"acc": 0.2880970432145565,
"acc_stderr": 0.012474469737197917
},
"harness|winogrande|5": {
"acc": 0.8168902920284136,
"acc_stderr": 0.01086977863316837
}
}
```
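As a minimal sketch (using only the standard library, with a hand-copied subset of the metrics shown above), the per-task results can also be inspected programmatically once the JSON is parsed:

```python
import json

# A subset of the latest results above, as a JSON string
# (copied from results_2023-10-25T19-48-19.078343.json).
latest = json.loads("""
{
  "harness|drop|3": {"em": 0.12174916107382551, "f1": 0.18035549496644265},
  "harness|gsm8k|5": {"acc": 0.2880970432145565},
  "harness|winogrande|5": {"acc": 0.8168902920284136}
}
""")

# Collect the accuracy-style metrics and report the best-scoring task.
accs = {task: m["acc"] for task, m in latest.items() if "acc" in m}
best_task = max(accs, key=accs.get)
print(best_task)                   # harness|winogrande|5
print(round(accs[best_task], 4))   # 0.8169
```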
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
mHossain/final_train_v2_360000 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 9145644.3
num_examples: 27000
- name: test
num_bytes: 1016182.7
num_examples: 3000
download_size: 4459398
dataset_size: 10161827.0
---
# Dataset Card for "final_train_v2_360000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vilm/the-stack-smol-xl-cleaned | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1147766215
num_examples: 205173
download_size: 368132773
dataset_size: 1147766215
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# The Stack smol-XL
A cleaned version of The Stack smol-XL |
Will-uob/147_Spectrogram_labels | ---
license: gpl-3.0
---
|
space-sue/storydb | ---
license: cc-by-4.0
---
|
suolyer/pile_wikipedia | ---
license: apache-2.0
---
|
rdpahalavan/packet-tag-explanation | ---
license: apache-2.0
tags:
- Network Intrusion Detection
- Cybersecurity
- Network Packets
size_categories:
- 100K<n<1M
language:
- en
---
This dataset contains packet information together with the tags and their corresponding explanations. For more information, [visit here](https://github.com/rdpahalavan/nids-transformers). |
open-llm-leaderboard/details_Walmart-the-bag__Quintellect-10.7B | ---
pretty_name: Evaluation run of Walmart-the-bag/Quintellect-10.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Walmart-the-bag/Quintellect-10.7B](https://huggingface.co/Walmart-the-bag/Quintellect-10.7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Walmart-the-bag__Quintellect-10.7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T01:12:27.024632](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__Quintellect-10.7B/blob/main/results_2024-03-22T01-12-27.024632.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6350567544191552,\n\
\ \"acc_stderr\": 0.03248520404869693,\n \"acc_norm\": 0.6366662196102063,\n\
\ \"acc_norm_stderr\": 0.03314087543508369,\n \"mc1\": 0.4320685434516524,\n\
\ \"mc1_stderr\": 0.01734120239498826,\n \"mc2\": 0.5956588813281548,\n\
\ \"mc2_stderr\": 0.015586776670525026\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6203071672354948,\n \"acc_stderr\": 0.014182119866974872,\n\
\ \"acc_norm\": 0.6501706484641638,\n \"acc_norm_stderr\": 0.0139368092121583\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6539533957379008,\n\
\ \"acc_stderr\": 0.004747360500742478,\n \"acc_norm\": 0.8447520414260108,\n\
\ \"acc_norm_stderr\": 0.003614007841341994\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072388,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072388\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473075,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.01646534546739152,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.01646534546739152\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381392,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381392\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4346368715083799,\n\
\ \"acc_stderr\": 0.016578997435496713,\n \"acc_norm\": 0.4346368715083799,\n\
\ \"acc_norm_stderr\": 0.016578997435496713\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579921,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579921\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4517601043024772,\n\
\ \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.4517601043024772,\n\
\ \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254187,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254187\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.019291961895066375,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.019291961895066375\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4320685434516524,\n\
\ \"mc1_stderr\": 0.01734120239498826,\n \"mc2\": 0.5956588813281548,\n\
\ \"mc2_stderr\": 0.015586776670525026\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6057619408642911,\n \
\ \"acc_stderr\": 0.013460852357095661\n }\n}\n```"
repo_url: https://huggingface.co/Walmart-the-bag/Quintellect-10.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|arc:challenge|25_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|gsm8k|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hellaswag|10_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T01-12-27.024632.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T01-12-27.024632.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- '**/details_harness|winogrande|5_2024-03-22T01-12-27.024632.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T01-12-27.024632.parquet'
- config_name: results
data_files:
- split: 2024_03_22T01_12_27.024632
path:
- results_2024-03-22T01-12-27.024632.parquet
- split: latest
path:
- results_2024-03-22T01-12-27.024632.parquet
---
# Dataset Card for Evaluation run of Walmart-the-bag/Quintellect-10.7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Walmart-the-bag/Quintellect-10.7B](https://huggingface.co/Walmart-the-bag/Quintellect-10.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Walmart-the-bag__Quintellect-10.7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-22T01:12:27.024632](https://huggingface.co/datasets/open-llm-leaderboard/details_Walmart-the-bag__Quintellect-10.7B/blob/main/results_2024-03-22T01-12-27.024632.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6350567544191552,
"acc_stderr": 0.03248520404869693,
"acc_norm": 0.6366662196102063,
"acc_norm_stderr": 0.03314087543508369,
"mc1": 0.4320685434516524,
"mc1_stderr": 0.01734120239498826,
"mc2": 0.5956588813281548,
"mc2_stderr": 0.015586776670525026
},
"harness|arc:challenge|25": {
"acc": 0.6203071672354948,
"acc_stderr": 0.014182119866974872,
"acc_norm": 0.6501706484641638,
"acc_norm_stderr": 0.0139368092121583
},
"harness|hellaswag|10": {
"acc": 0.6539533957379008,
"acc_stderr": 0.004747360500742478,
"acc_norm": 0.8447520414260108,
"acc_norm_stderr": 0.003614007841341994
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072388,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072388
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473075,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.01646534546739152,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.01646534546739152
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381392,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381392
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4346368715083799,
"acc_stderr": 0.016578997435496713,
"acc_norm": 0.4346368715083799,
"acc_norm_stderr": 0.016578997435496713
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399662,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399662
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579921,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579921
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4517601043024772,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.4517601043024772,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254187,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254187
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.019291961895066375,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.019291961895066375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4320685434516524,
"mc1_stderr": 0.01734120239498826,
"mc2": 0.5956588813281548,
"mc2_stderr": 0.015586776670525026
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.6057619408642911,
"acc_stderr": 0.013460852357095661
}
}
```
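For a quick local look at the numbers above, the per-task accuracies can be ranked without re-running any evaluation. A minimal sketch, using a few values copied from the JSON payload (in practice you would load the full `results_*.json` file):

```python
# Rank per-task accuracies from the aggregated results shown above.
# The three entries below are copied from the JSON for illustration.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8717948717948718},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8421052631578947},
}

# Sort tasks from strongest to weakest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)

for task, metrics in ranked:
    print(f"{metrics['acc']:.3f}  {task}")
```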
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
fabiochiu/doge | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 451322.0
num_examples: 5
download_size: 451958
dataset_size: 451322.0
---
# Dataset Card for "doge"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
clu-ling/clupubhealth | ---
license: apache-2.0
task_categories:
- summarization
language:
- en
tags:
- medical
size_categories:
- 1K<n<10K
- 10K<n<100K
---
# `clupubhealth`
The `CLUPubhealth` dataset is based on the [PUBHEALTH fact-checking dataset](https://github.com/neemakot/Health-Fact-Checking).
The PUBHEALTH dataset contains claims, explanations, and main texts. The explanations function as vetted summaries of the main texts. The CLUPubhealth dataset repurposes these fields into summaries and texts for use in training summarization models such as Facebook's BART.
There are currently 4 dataset configs, each of which has three splits (see Usage):
### `clupubhealth/mini`
This config includes only 200 samples per split. This is mostly used in testing scripts when small sets are desirable.
### `clupubhealth/base`
This is the base dataset, which includes the full PUBHEALTH set, sans False samples. The `test` split is a shortened version which includes only 200 samples. This allows for faster eval steps during training.
### `clupubhealth/expanded`
Where the base `train` split contains 5,078 data points, this expanded set includes 62,163 data points. ChatGPT was used to generate new versions of the summaries in the base set. After GPT expansion, a total of 72,498 samples were generated; however, this was reduced to ~62k after samples with poor BERTScores were eliminated.
### `clupubhealth/test`
This config has the full `test` split with ~1200 samples. Used for post-training evaluation.
## USAGE
To use the CLUPubhealth dataset use the `datasets` library:
```python
from datasets import load_dataset
data = load_dataset("clu-ling/clupubhealth", "base")
# Accepted config names: `mini`, `base`, `expanded`, `test`
``` |
arbml/Arabic_Hate_Speech | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet
dtype: string
- name: is_off
dtype: string
- name: is_hate
dtype: string
- name: is_vlg
dtype: string
- name: is_vio
dtype: string
splits:
- name: train
num_bytes: 1656540
num_examples: 8557
- name: validation
num_bytes: 234165
num_examples: 1266
download_size: 881261
dataset_size: 1890705
---
# Dataset Card for "Arabic_Hate_Speech"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jerpint/rock-paper-scissors | ---
license: cc-by-2.0
---
Clone of the dataset from https://laurencemoroney.com/datasets.html
|
navanchauhan/decimer-data-mini | ---
license: openrail
pretty_name: PubChem 68K
size_categories:
- 10K<n<100K
task_categories:
- image-to-text
dataset_info:
features:
- name: image
dtype: image
- name: smiles
dtype: string
- name: selfies
dtype: string
splits:
- name: train
num_bytes: 1185846198.576
num_examples: 68996
- name: test
num_bytes: 267097779.576
num_examples: 15499
- name: validation
num_bytes: 266912227.912
num_examples: 15499
download_size: 1692942822
dataset_size: 1719856206.064
---
Molecules in this set
* have a molecular weight of less than 1500 Daltons,
* do not possess counter ions,
* only contain the elements C, H, O, N, P, S, F, Cl, Br, I, Se and B,
* do not contain isotopes of hydrogen (D, T),
* have 3–40 bonds,
* do not contain any charged groups, including zwitterionic forms,
* only contain implicit hydrogens, except in functional groups,
* have fewer than 40 SMILES characters,
* contain no stereochemistry.
The original dataset from DECIMER was imported and randomly sampled. 516×516 images were generated using RDKit.
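Two of the criteria above (SMILES length and allowed elements) can be approximated with a naive text-level check. This is a sketch only: a real pipeline would parse the molecule with RDKit rather than scan the raw SMILES string, which this element scan does not handle robustly.

```python
# Naive sketch of two filters from the list above:
# SMILES length under 40 characters, and only allowed elements.
ALLOWED = {"C", "H", "O", "N", "P", "S", "F", "Cl", "Br", "I", "Se", "B"}

def passes_basic_filters(smiles: str) -> bool:
    if len(smiles) >= 40:
        return False
    # Element scan: try two-letter symbols first, then one-letter
    # (upper-cased, so aromatic lowercase atoms like "c" pass).
    i = 0
    while i < len(smiles):
        ch = smiles[i]
        if ch.isalpha():
            if smiles[i:i + 2] in ALLOWED:
                i += 2
                continue
            if ch.upper() in ALLOWED:
                i += 1
                continue
            return False  # symbol not in the allowed element set
        i += 1  # skip non-element characters: digits, bonds, brackets
    return True

print(passes_basic_filters("CCO"))          # ethanol: allowed elements, short
print(passes_basic_filters("[Na+].[Cl-]"))  # rejected: Na is not allowed
```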
## Reference
> Rajan, Kohulan; Zielesny, Achim; Steinbeck, Christoph (2021): DECIMER 1.0: Deep Learning for Chemical Image Recognition using Transformers. ChemRxiv. Preprint. https://doi.org/10.26434/chemrxiv.14479287.v1 |
prasoonskrishnan/movie_recomendation | ---
license: afl-3.0
---
|
CorpuSlave/MRC | ---
license: cc-by-nc-sa-4.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
- name: doc_id
dtype: string
splits:
- name: train
num_bytes: 804002756
num_examples: 486812
download_size: 424720173
dataset_size: 804002756
---
|
shivam9980/Inshorts-english | ---
license: apache-2.0
---
|
Locutusque/function-calling-chatml | ---
dataset_info:
features:
- name: system_message
dtype: string
- name: function_description
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 311913135
num_examples: 112960
download_size: 107035875
dataset_size: 311913135
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "function-calling-chatml"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-moral_scenarios-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 11487
num_examples: 5
- name: test
num_bytes: 11181249
num_examples: 895
download_size: 574740
dataset_size: 11192736
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-moral_scenarios-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nguyenvulebinh/asr-alignment | ---
license: apache-2.0
dataset_info:
- config_name: commonvoice
features:
- name: id
dtype: string
- name: text
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: words
sequence: string
- name: word_start
sequence: float64
- name: word_end
sequence: float64
- name: entity_start
sequence: int64
- name: entity_end
sequence: int64
- name: entity_label
sequence: string
splits:
- name: train
num_bytes: 43744079378.659
num_examples: 948733
- name: valid
num_bytes: 722372503.994
num_examples: 16353
download_size: 39798988113
dataset_size: 44466451882.653
- config_name: gigaspeech
features:
- name: id
dtype: string
- name: text
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: words
sequence: string
- name: word_start
sequence: float64
- name: word_end
sequence: float64
- name: entity_start
sequence: int64
- name: entity_end
sequence: int64
- name: entity_label
sequence: string
splits:
- name: train
num_bytes: 1032024261294.48
num_examples: 8282987
- name: valid
num_bytes: 1340974408.04
num_examples: 5715
download_size: 1148966064515
dataset_size: 1033365235702.52
- config_name: libris
features:
- name: id
dtype: string
- name: text
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: words
sequence: string
- name: word_start
sequence: float64
- name: word_end
sequence: float64
- name: entity_start
sequence: int64
- name: entity_end
sequence: int64
- name: entity_label
sequence: string
splits:
- name: train
num_bytes: 63849575890.896
num_examples: 281241
- name: valid
num_bytes: 793442600.643
num_examples: 5559
download_size: 61361142328
dataset_size: 64643018491.539
- config_name: mustc
features:
- name: id
dtype: string
- name: text
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: words
sequence: string
- name: word_start
sequence: float64
- name: word_end
sequence: float64
- name: entity_start
sequence: int64
- name: entity_end
sequence: int64
- name: entity_label
sequence: string
splits:
- name: train
num_bytes: 55552777413.1
num_examples: 248612
- name: valid
num_bytes: 313397447.704
num_examples: 1408
download_size: 52028374666
dataset_size: 55866174860.804
- config_name: tedlium
features:
- name: id
dtype: string
- name: text
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: words
sequence: string
- name: word_start
sequence: float64
- name: word_end
sequence: float64
- name: entity_start
sequence: int64
- name: entity_end
sequence: int64
- name: entity_label
sequence: string
splits:
- name: train
num_bytes: 56248950771.568
num_examples: 268216
- name: valid
num_bytes: 321930549.928
num_examples: 1456
download_size: 52557126451
dataset_size: 56570881321.496
- config_name: voxpopuli
features:
- name: id
dtype: string
- name: text
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: words
sequence: string
- name: word_start
sequence: float64
- name: word_end
sequence: float64
- name: entity_start
sequence: int64
- name: entity_end
sequence: int64
- name: entity_label
sequence: string
splits:
- name: train
num_bytes: 118516424284.524
num_examples: 182463
- name: valid
num_bytes: 1144543020.808
num_examples: 1842
download_size: 98669668241
dataset_size: 119660967305.332
configs:
- config_name: commonvoice
data_files:
- split: train
path: commonvoice/train-*
- split: valid
path: commonvoice/valid-*
- config_name: gigaspeech
data_files:
- split: train
path: gigaspeech/train-*
- split: valid
path: gigaspeech/valid-*
- config_name: libris
data_files:
- split: train
path: libris/train-*
- split: valid
path: libris/valid-*
- config_name: mustc
data_files:
- split: train
path: mustc/train-*
- split: valid
path: mustc/valid-*
- config_name: tedlium
data_files:
- split: train
path: tedlium/train-*
- split: valid
path: tedlium/valid-*
- config_name: voxpopuli
data_files:
- split: train
path: voxpopuli/train-*
- split: valid
path: voxpopuli/valid-*
language:
- en
pretty_name: Speech Recognition Alignment Dataset
size_categories:
- 10M<n<100M
---
# Speech Recognition Alignment Dataset
This dataset is a variation of several widely-used ASR datasets, encompassing Librispeech, MuST-C, TED-LIUM, VoxPopuli, Common Voice, and GigaSpeech. In addition to the original audio and transcripts, it includes:
- Precise alignment between audio and text.
- Text that has been punctuated and made case-sensitive.
- Identification of named entities in the text.
# Usage
First, install the latest version of the 🤗 Datasets package:
```bash
pip install --upgrade pip
pip install --upgrade datasets[audio]
```
The dataset can be downloaded and pre-processed on disk using the [`load_dataset`](https://huggingface.co/docs/datasets/v2.14.5/en/package_reference/loading_methods#datasets.load_dataset)
function:
```python
from datasets import load_dataset
# Available dataset: 'libris','mustc','tedlium','voxpopuli','commonvoice','gigaspeech'
dataset = load_dataset("nguyenvulebinh/asr-alignment", "libris")
# take the first sample of the train split
sample = dataset["train"][0]
```
It can also be streamed directly from the Hub using Datasets' [streaming mode](https://huggingface.co/blog/audio-datasets#streaming-mode-the-silver-bullet).
Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire
dataset to disk:
```python
from datasets import load_dataset
dataset = load_dataset("nguyenvulebinh/asr-alignment", "libris", streaming=True)
# take the first sample of the train split
sample = next(iter(dataset["train"]))
```
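
Each sample carries word-level timestamps and named-entity spans. A minimal sketch of how one might consume these fields, using the field names from the schema above with a toy sample in place of a downloaded one (whether `entity_end` is an inclusive word index is an assumption here, not confirmed by the card):

```python
# Toy sample mirroring the dataset schema; values are illustrative only.
sample = {
    "words": ["Hello", "Doctor", "Smith"],
    "word_start": [0.00, 0.42, 0.81],
    "word_end": [0.40, 0.79, 1.20],
    "entity_start": [1],
    "entity_end": [2],
    "entity_label": ["PER"],
}

def word_timestamps(sample):
    """Pair each word with its (start, end) times in seconds."""
    return list(zip(sample["words"], sample["word_start"], sample["word_end"]))

def entity_spans(sample):
    """Resolve entity index spans back to text, assuming inclusive word indices."""
    spans = []
    for start, end, label in zip(
        sample["entity_start"], sample["entity_end"], sample["entity_label"]
    ):
        text = " ".join(sample["words"][start : end + 1])
        spans.append((text, label))
    return spans

print(word_timestamps(sample))  # [('Hello', 0.0, 0.4), ('Doctor', 0.42, 0.79), ('Smith', 0.81, 1.2)]
print(entity_spans(sample))     # [('Doctor Smith', 'PER')]
```

The same functions apply unchanged to a real sample loaded via `load_dataset`, since the keys match the schema declared in the YAML header.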
## Citation
If you use this data, please consider citing the ICASSP 2024 paper "SYNTHETIC CONVERSATIONS IMPROVE MULTI-TALKER ASR":
```
@INPROCEEDINGS{synthetic-multi-asr-nguyen,
author={Nguyen, Thai-Binh and Waibel, Alexander},
booktitle={ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
title={SYNTHETIC CONVERSATIONS IMPROVE MULTI-TALKER ASR},
year={2024},
volume={},
number={},
}
```
## License
This dataset is licensed in accordance with the terms of the original dataset. |
djvictordance/vocal | ---
license: openrail
---
|
freshpearYoon/vr_val_free_6 | ---
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: filename
dtype: string
- name: NumOfUtterance
dtype: int64
- name: text
dtype: string
- name: samplingrate
dtype: int64
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: speaker_id
dtype: string
- name: directory
dtype: string
splits:
- name: train
num_bytes: 7003355378
num_examples: 9105
download_size: 1171175037
dataset_size: 7003355378
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nekofura/zero_one | ---
license: apache-2.0
task_categories:
- text-generation
tags:
- kamen_rider
- zero-one
--- |
ibivibiv/alpaca_lamini17 | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
splits:
- name: train
num_bytes: 56355233
num_examples: 129280
download_size: 36396771
dataset_size: 56355233
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|