---
pretty_name: Evaluation run of 0-hero/Matter-0.2-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [0-hero/Matter-0.2-7B](https://huggingface.co/0-hero/Matter-0.2-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_0-hero__Matter-0.2-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-03T01:45:27.142042](https://huggingface.co/datasets/open-llm-leaderboard/details_0-hero__Matter-0.2-7B/blob/main/results_2024-04-03T01-45-27.142042.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\"\
\ split for each eval):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.6258435998641223,\n\
\ \"acc_stderr\": 0.03245288877827044,\n \"acc_norm\": 0.6283412392485703,\n\
\ \"acc_norm_stderr\": 0.03310716675283535,\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.481088597087512,\n\
\ \"mc2_stderr\": 0.015055232875750942\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5819112627986348,\n \"acc_stderr\": 0.014413988396996076,\n\
\ \"acc_norm\": 0.6160409556313993,\n \"acc_norm_stderr\": 0.01421244498065189\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6285600477992431,\n\
\ \"acc_stderr\": 0.004822022254886021,\n \"acc_norm\": 0.8239394542919737,\n\
\ \"acc_norm_stderr\": 0.003800932770597754\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03782728980865469,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03782728980865469\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n\
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n\
\ \"acc_stderr\": 0.025091892378859275,\n \"acc_norm\": 0.7354838709677419,\n\
\ \"acc_norm_stderr\": 0.025091892378859275\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.027479603010538808,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.027479603010538808\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723886,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723886\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187208,\n\
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187208\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612896,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612896\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313732,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313732\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822583,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822583\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n\
\ \"acc_stderr\": 0.01428337804429642,\n \"acc_norm\": 0.8007662835249042,\n\
\ \"acc_norm_stderr\": 0.01428337804429642\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247337,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247337\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3865921787709497,\n\
\ \"acc_stderr\": 0.016286674879101022,\n \"acc_norm\": 0.3865921787709497,\n\
\ \"acc_norm_stderr\": 0.016286674879101022\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279056,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279056\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788513,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788513\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195448,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195448\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n\
\ \"acc_stderr\": 0.012733671880342506,\n \"acc_norm\": 0.4621903520208605,\n\
\ \"acc_norm_stderr\": 0.012733671880342506\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n\
\ \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.02971932942241748,\n\
\ \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.02971932942241748\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776348,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776348\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3353733170134639,\n\
\ \"mc1_stderr\": 0.01652753403966899,\n \"mc2\": 0.481088597087512,\n\
\ \"mc2_stderr\": 0.015055232875750942\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.011350315707462059\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5390447308567097,\n \
\ \"acc_stderr\": 0.01373042844911634\n }\n}\n```"
repo_url: https://huggingface.co/0-hero/Matter-0.2-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|arc:challenge|25_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|gsm8k|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hellaswag|10_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T01-45-27.142042.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T01-45-27.142042.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- '**/details_harness|winogrande|5_2024-04-03T01-45-27.142042.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-03T01-45-27.142042.parquet'
- config_name: results
data_files:
- split: 2024_04_03T01_45_27.142042
path:
- results_2024-04-03T01-45-27.142042.parquet
- split: latest
path:
- results_2024-04-03T01-45-27.142042.parquet
---
# Dataset Card for Evaluation run of 0-hero/Matter-0.2-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [0-hero/Matter-0.2-7B](https://huggingface.co/0-hero/Matter-0.2-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_0-hero__Matter-0.2-7B",
"harness_winogrande_5",
split="train")
```
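As the configs above show, each timestamped split name is simply the run timestamp with `-` and `:` replaced by `_` (the `.` before the microseconds is kept). A hypothetical helper illustrating that mapping (not part of `datasets` or the leaderboard tooling):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to its split name.

    e.g. '2024-04-03T01:45:27.142042' -> '2024_04_03T01_45_27.142042'
    """
    # Split names may not contain '-' or ':', so both are replaced by '_'.
    return timestamp.replace("-", "_").replace(":", "_")


print(run_timestamp_to_split("2024-04-03T01:45:27.142042"))
# -> 2024_04_03T01_45_27.142042
```

This is how you would target a specific run's split instead of `"latest"` or `"train"` in the `load_dataset` call above.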
## Latest results
These are the [latest results from run 2024-04-03T01:45:27.142042](https://huggingface.co/datasets/open-llm-leaderboard/details_0-hero__Matter-0.2-7B/blob/main/results_2024-04-03T01-45-27.142042.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6258435998641223,
"acc_stderr": 0.03245288877827044,
"acc_norm": 0.6283412392485703,
"acc_norm_stderr": 0.03310716675283535,
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.481088597087512,
"mc2_stderr": 0.015055232875750942
},
"harness|arc:challenge|25": {
"acc": 0.5819112627986348,
"acc_stderr": 0.014413988396996076,
"acc_norm": 0.6160409556313993,
"acc_norm_stderr": 0.01421244498065189
},
"harness|hellaswag|10": {
"acc": 0.6285600477992431,
"acc_stderr": 0.004822022254886021,
"acc_norm": 0.8239394542919737,
"acc_norm_stderr": 0.003800932770597754
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.025091892378859275,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.025091892378859275
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.027479603010538808,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.027479603010538808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723886,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723886
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187208,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187208
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612896,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612896
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313732,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313732
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822583,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822583
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.01428337804429642,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.01428337804429642
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247337,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3865921787709497,
"acc_stderr": 0.016286674879101022,
"acc_norm": 0.3865921787709497,
"acc_norm_stderr": 0.016286674879101022
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279056,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279056
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788513,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788513
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195448,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195448
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4621903520208605,
"acc_stderr": 0.012733671880342506,
"acc_norm": 0.4621903520208605,
"acc_norm_stderr": 0.012733671880342506
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5992647058823529,
"acc_stderr": 0.029768263528933105,
"acc_norm": 0.5992647058823529,
"acc_norm_stderr": 0.029768263528933105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.02971932942241748,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.02971932942241748
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776348,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776348
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3353733170134639,
"mc1_stderr": 0.01652753403966899,
"mc2": 0.481088597087512,
"mc2_stderr": 0.015055232875750942
},
"harness|winogrande|5": {
"acc": 0.7947908445146015,
"acc_stderr": 0.011350315707462059
},
"harness|gsm8k|5": {
"acc": 0.5390447308567097,
"acc_stderr": 0.01373042844911634
}
}
```
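The top-level `"all"` block appears to aggregate the per-task scores; a minimal sketch of such an aggregation over a subset of the accuracies above (assumption: a simple unweighted mean over tasks, which is how the harness commonly reports it):

```python
# Per-task accuracies copied from the results above (subset for illustration).
results = {
    "harness|arc:challenge|25": {"acc": 0.5819112627986348},
    "harness|hellaswag|10": {"acc": 0.6285600477992431},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
}

# Unweighted mean over tasks (assumed aggregation; stderr is aggregated separately).
accs = [scores["acc"] for scores in results.values()]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 6))
```

The same pattern extends to `acc_norm`, `mc1`, and `mc2`; the exact aggregation used by the leaderboard lives in its own tooling, so treat this as a sketch.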
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w](https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-12T14:49:01.591870](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w/blob/main/results_2023-10-12T14-49-01.591870.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003145973154362416,\n\
\ \"em_stderr\": 0.0005734993648436388,\n \"f1\": 0.06228817114093964,\n\
\ \"f1_stderr\": 0.0014101371508567083,\n \"acc\": 0.4173053896873633,\n\
\ \"acc_stderr\": 0.009418776710625477\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.003145973154362416,\n \"em_stderr\": 0.0005734993648436388,\n\
\ \"f1\": 0.06228817114093964,\n \"f1_stderr\": 0.0014101371508567083\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06823351023502654,\n \
\ \"acc_stderr\": 0.006945358944067431\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|arc:challenge|25_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_12T14_49_01.591870
path:
- '**/details_harness|drop|3_2023-10-12T14-49-01.591870.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-12T14-49-01.591870.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_12T14_49_01.591870
path:
- '**/details_harness|gsm8k|5_2023-10-12T14-49-01.591870.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-12T14-49-01.591870.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hellaswag|10_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_12T14_49_01.591870
path:
- '**/details_harness|winogrande|5_2023-10-12T14-49-01.591870.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-12T14-49-01.591870.parquet'
- config_name: results
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- results_2023-09-05T10:13:11.603787.parquet
- split: 2023_10_12T14_49_01.591870
path:
- results_2023-10-12T14-49-01.591870.parquet
- split: latest
path:
- results_2023-10-12T14-49-01.591870.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w](https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w",
"harness_winogrande_5",
split="train")
```
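Run-specific splits follow a naming convention derived from the run timestamp: judging from the split names listed in this card (e.g. `2023_10_12T14_49_01.591870` for run `2023-10-12T14:49:01.591870`), the `-` and `:` separators are replaced with `_`. The helper below sketches that mapping; it is an assumption inferred from the names in this card, not an official API:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp (as it appears in results filenames) to the
    split name used in this dataset's configs.

    Assumption: based on the split names listed in this card, the split
    name is the timestamp with '-' and ':' replaced by '_'.
    """
    return timestamp.replace("-", "_").replace(":", "_")


# For example, the run above maps to its split name:
print(run_timestamp_to_split("2023-10-12T14:49:01.591870"))
# 2023_10_12T14_49_01.591870
```

Passing the resulting name as `split=` to `load_dataset` selects that specific run, while `split="latest"` always selects the most recent one.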
## Latest results
These are the [latest results from run 2023-10-12T14:49:01.591870](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w/blob/main/results_2023-10-12T14-49-01.591870.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436388,
"f1": 0.06228817114093964,
"f1_stderr": 0.0014101371508567083,
"acc": 0.4173053896873633,
"acc_stderr": 0.009418776710625477
},
"harness|drop|3": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436388,
"f1": 0.06228817114093964,
"f1_stderr": 0.0014101371508567083
},
"harness|gsm8k|5": {
"acc": 0.06823351023502654,
"acc_stderr": 0.006945358944067431
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183524
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
noe-zaabi/LLM-Science-Standardized | ---
license: mit
---
|
open-llm-leaderboard/details_adamo1139__Mistral-7B-AEZAKMI-v2 | ---
pretty_name: Evaluation run of adamo1139/Mistral-7B-AEZAKMI-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [adamo1139/Mistral-7B-AEZAKMI-v2](https://huggingface.co/adamo1139/Mistral-7B-AEZAKMI-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adamo1139__Mistral-7B-AEZAKMI-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T23:30:41.802824](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Mistral-7B-AEZAKMI-v2/blob/main/results_2024-01-10T23-30-41.802824.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5966405320930094,\n\
\ \"acc_stderr\": 0.03315289936870293,\n \"acc_norm\": 0.6024565187511302,\n\
\ \"acc_norm_stderr\": 0.03382960096382984,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.01683886288396583,\n \"mc2\": 0.5149993147622676,\n\
\ \"mc2_stderr\": 0.01592337993023178\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5597269624573379,\n \"acc_stderr\": 0.014506769524804237,\n\
\ \"acc_norm\": 0.5810580204778157,\n \"acc_norm_stderr\": 0.014418106953639013\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.635929097789285,\n\
\ \"acc_stderr\": 0.004801852881329736,\n \"acc_norm\": 0.8253335988846843,\n\
\ \"acc_norm_stderr\": 0.0037890554870031834\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.03878139888797611,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.03878139888797611\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n\
\ \"acc_stderr\": 0.037657466938651504,\n \"acc_norm\": 0.5780346820809249,\n\
\ \"acc_norm_stderr\": 0.037657466938651504\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266346,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.03266204299064678,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.03266204299064678\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159788,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7129032258064516,\n\
\ \"acc_stderr\": 0.025736542745594528,\n \"acc_norm\": 0.7129032258064516,\n\
\ \"acc_norm_stderr\": 0.025736542745594528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.031911782267135466,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.031911782267135466\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.02503387058301518,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.02503387058301518\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5769230769230769,\n \"acc_stderr\": 0.025049197876042345,\n\
\ \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.025049197876042345\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.031693802357129965,\n\
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.031693802357129965\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443128,\n \"\
acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443128\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.033723432716530645,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.033723432716530645\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.03957835471980979,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.03957835471980979\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.02441494730454368,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.02441494730454368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7931034482758621,\n\
\ \"acc_stderr\": 0.014485656041669175,\n \"acc_norm\": 0.7931034482758621,\n\
\ \"acc_norm_stderr\": 0.014485656041669175\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688218,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688218\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28938547486033517,\n\
\ \"acc_stderr\": 0.015166544550490308,\n \"acc_norm\": 0.28938547486033517,\n\
\ \"acc_norm_stderr\": 0.015166544550490308\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824087,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824087\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4348109517601043,\n\
\ \"acc_stderr\": 0.012661233805616292,\n \"acc_norm\": 0.4348109517601043,\n\
\ \"acc_norm_stderr\": 0.012661233805616292\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6078431372549019,\n \"acc_stderr\": 0.019751726508762637,\n \
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.019751726508762637\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\
\ \"acc_stderr\": 0.028996909693328923,\n \"acc_norm\": 0.7860696517412935,\n\
\ \"acc_norm_stderr\": 0.028996909693328923\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653697,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653697\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.01683886288396583,\n \"mc2\": 0.5149993147622676,\n\
\ \"mc2_stderr\": 0.01592337993023178\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7363851617995264,\n \"acc_stderr\": 0.012382849299658457\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3244882486732373,\n \
\ \"acc_stderr\": 0.012896095359768106\n }\n}\n```"
repo_url: https://huggingface.co/adamo1139/Mistral-7B-AEZAKMI-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|arc:challenge|25_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|gsm8k|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hellaswag|10_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T23-30-41.802824.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T23-30-41.802824.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- '**/details_harness|winogrande|5_2024-01-10T23-30-41.802824.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T23-30-41.802824.parquet'
- config_name: results
data_files:
- split: 2024_01_10T23_30_41.802824
path:
- results_2024-01-10T23-30-41.802824.parquet
- split: latest
path:
- results_2024-01-10T23-30-41.802824.parquet
---
# Dataset Card for Evaluation run of adamo1139/Mistral-7B-AEZAKMI-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [adamo1139/Mistral-7B-AEZAKMI-v2](https://huggingface.co/adamo1139/Mistral-7B-AEZAKMI-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
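As an illustration (not part of the original card), the timestamped split names such as `2024_01_10T23_30_41.802824` are the run timestamps with `-` and `:` replaced by `_`; they can be parsed back into a `datetime` with the standard library alone:

```python
from datetime import datetime

split_name = "2024_01_10T23_30_41.802824"

# Split names replace '-' (in the date) and ':' (in the time) with '_';
# restore them on each side of the 'T' separator before parsing.
date_part, time_part = split_name.split("T")
timestamp = datetime.strptime(
    date_part.replace("_", "-") + "T" + time_part.replace("_", ":"),
    "%Y-%m-%dT%H:%M:%S.%f",
)
```

This is only a sketch for working out which split corresponds to which run; the `latest` split already aliases the most recent one.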
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adamo1139__Mistral-7B-AEZAKMI-v2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-10T23:30:41.802824](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Mistral-7B-AEZAKMI-v2/blob/main/results_2024-01-10T23-30-41.802824.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5966405320930094,
"acc_stderr": 0.03315289936870293,
"acc_norm": 0.6024565187511302,
"acc_norm_stderr": 0.03382960096382984,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.01683886288396583,
"mc2": 0.5149993147622676,
"mc2_stderr": 0.01592337993023178
},
"harness|arc:challenge|25": {
"acc": 0.5597269624573379,
"acc_stderr": 0.014506769524804237,
"acc_norm": 0.5810580204778157,
"acc_norm_stderr": 0.014418106953639013
},
"harness|hellaswag|10": {
"acc": 0.635929097789285,
"acc_stderr": 0.004801852881329736,
"acc_norm": 0.8253335988846843,
"acc_norm_stderr": 0.0037890554870031834
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.03878139888797611,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.03878139888797611
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5780346820809249,
"acc_stderr": 0.037657466938651504,
"acc_norm": 0.5780346820809249,
"acc_norm_stderr": 0.037657466938651504
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266346,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159788,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7129032258064516,
"acc_stderr": 0.025736542745594528,
"acc_norm": 0.7129032258064516,
"acc_norm_stderr": 0.025736542745594528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.031911782267135466,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.031911782267135466
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.02503387058301518,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.02503387058301518
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.025049197876042345,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.025049197876042345
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.031693802357129965,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.031693802357129965
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.017658710594443128,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.017658710594443128
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.033723432716530645,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.033723432716530645
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980979,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980979
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724147,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724147
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02441494730454368,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02441494730454368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7931034482758621,
"acc_stderr": 0.014485656041669175,
"acc_norm": 0.7931034482758621,
"acc_norm_stderr": 0.014485656041669175
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688218,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28938547486033517,
"acc_stderr": 0.015166544550490308,
"acc_norm": 0.28938547486033517,
"acc_norm_stderr": 0.015166544550490308
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824087,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824087
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4348109517601043,
"acc_stderr": 0.012661233805616292,
"acc_norm": 0.4348109517601043,
"acc_norm_stderr": 0.012661233805616292
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5919117647058824,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.5919117647058824,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.019751726508762637,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.019751726508762637
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.028996909693328923,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.028996909693328923
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653697,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653697
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.01683886288396583,
"mc2": 0.5149993147622676,
"mc2_stderr": 0.01592337993023178
},
"harness|winogrande|5": {
"acc": 0.7363851617995264,
"acc_stderr": 0.012382849299658457
},
"harness|gsm8k|5": {
"acc": 0.3244882486732373,
"acc_stderr": 0.012896095359768106
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Shavindra/satellite-512 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
- name: pixel_values
dtype: image
splits:
- name: train
num_bytes: 332055030.0
num_examples: 304
download_size: 0
dataset_size: 332055030.0
---
# Dataset Card for "satellite-512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lilacai/lilac-glaive-code-assistant | ---
tags:
- Lilac
---
# lilac/glaive-code-assistant
This dataset is a [Lilac](http://lilacml.com) processed dataset. Original dataset: [https://huggingface.co/datasets/glaiveai/glaive-code-assistant](https://huggingface.co/datasets/glaiveai/glaive-code-assistant)
To download the dataset to a local directory:
```bash
lilac download lilacai/lilac-glaive-code-assistant
```
or from Python with:
```py
import lilac as ll  # pip install lilac

ll.download("lilacai/lilac-glaive-code-assistant")
```
|
Mandala1/elements | ---
dataset_info:
features:
- name: image
dtype: binary
- name: text
dtype: string
splits:
- name: train
num_bytes: 200541
num_examples: 3
download_size: 205936
dataset_size: 200541
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CATTAC/SCAD | ---
license: apache-2.0
---
|
FatemahAlsubaiei/CGSQuAD | ---
task_categories:
- question-answering
language:
- ar
--- |
Matteomasala1997/vocalimiecanto | ---
license: unknown
---
|
jsn27/medical_faq | ---
license: mit
---
|
Denisilva/VOZSuellen | ---
license: openrail
---
|
tuanmanh28/VIVOS_CommonVoice_FOSD_Control_processed_dataset | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: input_values
sequence: float32
- name: input_length
dtype: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 16624719566.846472
num_examples: 41349
- name: test
num_bytes: 1997358586.5
num_examples: 5564
download_size: 17580350437
dataset_size: 18622078153.346474
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "VIVOS_CommonVoice_FOSD_Control_processed_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
haml/newDataset | ---
license: apache-2.0
---
|
VAST-AI/LD-T3D | ---
annotations_creators:
- VastAI
language:
- en
license: mit
size_categories:
- 10K<n<100K
source_datasets:
- Objaverse
task_categories:
- feature-extraction
pretty_name: LD-T3D
dataset_info:
- config_name: default
features:
- name: query_id
dtype: string
- name: target_ids
sequence: string
- name: GT_ids
sequence: string
- name: caption
dtype: string
- name: difficulty
dtype: string
splits:
- name: full
num_bytes: 4518833
num_examples: 1000
- name: train
num_bytes: 3622616
num_examples: 800
- name: test
num_bytes: 896217
num_examples: 200
download_size: 8220035
dataset_size: 9037666
- config_name: pc_npy
features:
- name: source_id
dtype: string
- name: pc
sequence:
sequence: float32
splits:
- name: base
num_bytes: 24989649153
num_examples: 89236
download_size: 14694609454
dataset_size: 24989649153
- config_name: rendered_imgs_above
features:
- name: image
dtype: image
- name: source_id
dtype: string
splits:
- name: base
num_bytes: 3535205800.528
num_examples: 89236
download_size: 3593522799
dataset_size: 3535205800.528
- config_name: rendered_imgs_back
features:
- name: image
dtype: image
- name: source_id
dtype: string
splits:
- name: base
num_bytes: 3603159193
num_examples: 89236
download_size: 3585908828
dataset_size: 3603159193
- config_name: rendered_imgs_below
features:
- name: image
dtype: image
- name: source_id
dtype: string
splits:
- name: base
num_bytes: 3523265309.84
num_examples: 89236
download_size: 3546430113
dataset_size: 3523265309.84
- config_name: rendered_imgs_diag_above
features:
- name: image
dtype: image
- name: source_id
dtype: string
splits:
- name: base
num_bytes: 4447312299.552
num_examples: 89236
download_size: 4478290475
dataset_size: 4447312299.552
- config_name: rendered_imgs_diag_below
features:
- name: image
dtype: image
- name: source_id
dtype: string
splits:
- name: base
num_bytes: 4098391329.84
num_examples: 89236
download_size: 4135673628
dataset_size: 4098391329.84
- config_name: rendered_imgs_front
features:
- name: image
dtype: image
- name: source_id
dtype: string
splits:
- name: base
num_bytes: 3700436427.432
num_examples: 89236
download_size: 3714653215
dataset_size: 3700436427.432
- config_name: rendered_imgs_left
features:
- name: image
dtype: image
- name: source_id
dtype: string
splits:
- name: base
num_bytes: 3204117217.64
num_examples: 89236
download_size: 3174969379
dataset_size: 3204117217.64
- config_name: rendered_imgs_right
features:
- name: image
dtype: image
- name: source_id
dtype: string
splits:
- name: base
num_bytes: 3205641546.992
num_examples: 89236
download_size: 3196672078
dataset_size: 3205641546.992
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: full
path: data/full-*
- config_name: pc_npy
data_files:
- split: base
path: pc_npy/base-*
- config_name: relations
data_files:
- split: full
path: relations/full-*
- config_name: rendered_imgs_above
data_files:
- split: base
path: rendered_imgs_above/base-*
- config_name: rendered_imgs_back
data_files:
- split: base
path: rendered_imgs_back/base-*
- config_name: rendered_imgs_below
data_files:
- split: base
path: rendered_imgs_below/base-*
- config_name: rendered_imgs_diag_above
data_files:
- split: base
path: rendered_imgs_diag_above/base-*
- config_name: rendered_imgs_diag_below
data_files:
- split: base
path: rendered_imgs_diag_below/base-*
- config_name: rendered_imgs_front
data_files:
- split: base
path: rendered_imgs_front/base-*
- config_name: rendered_imgs_left
data_files:
- split: base
path: rendered_imgs_left/base-*
- config_name: rendered_imgs_right
data_files:
- split: base
path: rendered_imgs_right/base-*
tags:
- retrieval
- text-based-3D
- 3D
---

# LD-T3D: A Large-scale and Diverse Benchmark for Text-based 3D Model Retrieval
## Dataset Description
- **Repository:** [VAST-AI/LD-T3D](https://github.com/yuanze1024/LD-T3D)
- **Visualization Demo:** [VAST-AI/LD-T3D 🤗 Space](https://huggingface.co/spaces/VAST-AI/LD-T3D)
- **Paper:** [LD-T3D: A Large-scale and Diverse Benchmark for Text-based 3D Model Retrieval](https://arxiv.org)
- **Point of Contact:** [Ze Yuan](yuanze1024@buaa.edu.cn)
### Dataset Summary
The official dataset repo for the paper "**LD-T3D: A Large-scale and Diverse Benchmark for Text-based 3D Model Retrieval**". We introduce a novel Large-scale and Diverse benchmark for Text-based 3D model Retrieval, named **LD-T3D**, consisting of about 100k text-to-3D-model pairs, covering 89k distinct 3D models (collected from **Objaverse**) and 1,000 descriptive text queries.
The federated dataset is divided into 1,000 sub-datasets, each corresponding to one textual query and about 100 3D models; the 3D models contained in different sub-datasets may overlap.
### Dataset Design
1. Text-to-3D Model Relation **(key)**
The format of the data is shown in the dataset viewer.
```python
from datasets import load_dataset  # pip install datasets

cache_dir = "./hf_cache"  # optional local cache directory
dataset = load_dataset("VAST-AI/LD-T3D", split="full", cache_dir=cache_dir)
```
You may see a log like this:
```shell
Downloading readme: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 6.70k/6.70k [00:00<00:00, 22.7MB/s]
Downloading data: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 3.30M/3.30M [00:03<00:00, 1.08MB/s]
Downloading data: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 837k/837k [00:00<00:00, 1.10MB/s]
Downloading data: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4.09M/4.09M [00:00<00:00, 4.42MB/s]
Generating train split: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 800/800 [00:00<00:00, 36971.32 examples/s]
Generating test split: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 200/200 [00:00<00:00, 30699.39 examples/s]
Generating full split: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1000/1000 [00:00<00:00, 42136.87 examples/s]
```
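Each record in the `full` split pairs a text query (`caption`) with candidate model IDs (`target_ids`) and its ground-truth matches (`GT_ids`). As a minimal sketch of how such a record might be scored, the snippet below computes precision@k over a hypothetical record that mimics this schema (the record and ranking are illustrative stand-ins, not real rows):

```python
def precision_at_k(retrieved, gt, k):
    """Fraction of the top-k retrieved IDs that are ground-truth matches."""
    gt_set = set(gt)
    top_k = retrieved[:k]
    return sum(1 for sid in top_k if sid in gt_set) / k

# Hypothetical record mimicking the `full` split schema.
record = {
    "query_id": "q_0001",
    "caption": "a red wooden chair",
    "target_ids": ["m1", "m2", "m3", "m4"],
    "GT_ids": ["m2", "m4"],
}

# Suppose a retrieval method ranked the candidates like this:
ranking = ["m2", "m1", "m4", "m3"]
print(precision_at_k(ranking, record["GT_ids"], 2))  # 0.5
```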
**We also provide some of the data we used during evaluation.**
2. 3D Point Cloud
Point clouds derived from the `.glb` files using the [openshape pc converter](https://huggingface.co/OpenShape/openshape-demo-support/blob/main/openshape/demo/misc_utils.py).
```python
dataset = load_dataset("VAST-AI/LD-T3D", name="pc_npy", split="base", cache_dir=cache_dir)  # {'source_id': str, 'pc': numpy.ndarray}
```
3. Rendered Images in WEBP
```python
for angle in ["diag_below", "diag_above", "right", "left", "back", "front", "above", "below"]:
    dataset = load_dataset("VAST-AI/LD-T3D", name=f"rendered_imgs_{angle}", split="base", cache_dir=cache_dir)  # {'source_id': str, 'image': PIL.Image}
```
4. **Cap3D** Captions for 3D Models
```python
data_files = {"captions": "Cap3D_automated_Objaverse_no3Dword.csv"}
dataset = load_dataset("tiange/Cap3D", data_files=data_files, names=["source_id", "caption"], header=None, split='captions', cache_dir=cache_dir)
```
### Other Repo
You can refer to the [HF Space](https://huggingface.co/spaces/VAST-AI/LD-T3D) for a retrieval visualization demo, or the [github repo](https://github.com/yuanze1024/LD-T3D) for more code to evaluate your customized text-based 3D retrieval methods.
|
WendyHoang/Reference_Extraction_Data | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: label
sequence: string
splits:
- name: train
num_bytes: 4492435.878186377
num_examples: 4307
- name: validation
num_bytes: 499623.12181362306
num_examples: 479
- name: test
num_bytes: 1261001
num_examples: 1235
download_size: 815064
dataset_size: 6253060.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
BangumiBase/sangatsunolion | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Sangatsu No Lion
This is the image base of bangumi Sangatsu no Lion, we detected 33 characters, 3830 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may be noisy in practice.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1087 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 167 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 205 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 49 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 126 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 39 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 179 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 96 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 264 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 111 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 29 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 34 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 19 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 44 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 56 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 27 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 28 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 405 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 203 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 13 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 16 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 142 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 20 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 8 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 23 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 23 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 46 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 55 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 9 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 8 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 39 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 9 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 251 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
christinacdl/Hate_Political_Opponent_2021_Test_Set | ---
license: apache-2.0
language:
- en
---
Test set from "Hate Towards the Political Opponent" (Grimminger et al., 2021). |
open-llm-leaderboard/details_kyujinpy__SOLAR-Platypus-10.7B-v2 | ---
pretty_name: Evaluation run of kyujinpy/SOLAR-Platypus-10.7B-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kyujinpy/SOLAR-Platypus-10.7B-v2](https://huggingface.co/kyujinpy/SOLAR-Platypus-10.7B-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kyujinpy__SOLAR-Platypus-10.7B-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T16:14:50.048840](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__SOLAR-Platypus-10.7B-v2/blob/main/results_2023-12-16T16-14-50.048840.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5933977113371075,\n\
\ \"acc_stderr\": 0.033089600641254734,\n \"acc_norm\": 0.6032526200271864,\n\
\ \"acc_norm_stderr\": 0.033912305079181165,\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.0158463151013948,\n \"mc2\": 0.4314947895428414,\n\
\ \"mc2_stderr\": 0.014252289388190327\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5443686006825939,\n \"acc_stderr\": 0.01455374993930686,\n\
\ \"acc_norm\": 0.5938566552901023,\n \"acc_norm_stderr\": 0.014351656690097862\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6358295160326628,\n\
\ \"acc_stderr\": 0.004802133511654238,\n \"acc_norm\": 0.8356901015733917,\n\
\ \"acc_norm_stderr\": 0.003697992356124479\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n\
\ \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n\
\ \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4553191489361702,\n\
\ \"acc_stderr\": 0.032555253593403555,\n \"acc_norm\": 0.4553191489361702,\n\
\ \"acc_norm_stderr\": 0.032555253593403555\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370331,\n \"\
acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370331\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113729,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113729\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7354838709677419,\n \"acc_stderr\": 0.02509189237885928,\n \"\
acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.02509189237885928\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n \"\
acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586794,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586794\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5641025641025641,\n \"acc_stderr\": 0.025141801511177498,\n\
\ \"acc_norm\": 0.5641025641025641,\n \"acc_norm_stderr\": 0.025141801511177498\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03196876989195778,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03196876989195778\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7908256880733945,\n \"acc_stderr\": 0.017437937173343233,\n \"\
acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.017437937173343233\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437416,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437416\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516301,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516301\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7982120051085568,\n\
\ \"acc_stderr\": 0.014351702181636864,\n \"acc_norm\": 0.7982120051085568,\n\
\ \"acc_norm_stderr\": 0.014351702181636864\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.02615219861972679,\n\
\ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.02615219861972679\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n\
\ \"acc_stderr\": 0.016104833880142295,\n \"acc_norm\": 0.3653631284916201,\n\
\ \"acc_norm_stderr\": 0.016104833880142295\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776165,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776165\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.025976566010862748,\n\
\ \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.025976566010862748\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43415906127770537,\n\
\ \"acc_stderr\": 0.01265903323706725,\n \"acc_norm\": 0.43415906127770537,\n\
\ \"acc_norm_stderr\": 0.01265903323706725\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.02981263070156974,\n\
\ \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.02981263070156974\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.619281045751634,\n \"acc_stderr\": 0.019643801557924806,\n \
\ \"acc_norm\": 0.619281045751634,\n \"acc_norm_stderr\": 0.019643801557924806\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5551020408163265,\n \"acc_stderr\": 0.031814251181977865,\n\
\ \"acc_norm\": 0.5551020408163265,\n \"acc_norm_stderr\": 0.031814251181977865\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533207,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533207\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n\
\ \"mc1_stderr\": 0.0158463151013948,\n \"mc2\": 0.4314947895428414,\n\
\ \"mc2_stderr\": 0.014252289388190327\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8145224940805051,\n \"acc_stderr\": 0.010923965303140503\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0401819560272934,\n \
\ \"acc_stderr\": 0.005409439736970511\n }\n}\n```"
repo_url: https://huggingface.co/kyujinpy/SOLAR-Platypus-10.7B-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|arc:challenge|25_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|gsm8k|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hellaswag|10_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-14-50.048840.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T16-14-50.048840.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- '**/details_harness|winogrande|5_2023-12-16T16-14-50.048840.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T16-14-50.048840.parquet'
- config_name: results
data_files:
- split: 2023_12_16T16_14_50.048840
path:
- results_2023-12-16T16-14-50.048840.parquet
- split: latest
path:
- results_2023-12-16T16-14-50.048840.parquet
---
# Dataset Card for Evaluation run of kyujinpy/SOLAR-Platypus-10.7B-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kyujinpy/SOLAR-Platypus-10.7B-v2](https://huggingface.co/kyujinpy/SOLAR-Platypus-10.7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kyujinpy__SOLAR-Platypus-10.7B-v2",
"harness_winogrande_5",
split="train")
```
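The same pattern works for any of the per-task configurations listed in the metadata above. As a small sketch (the `mmlu_config` helper below is ours, not part of the `datasets` API), the config name for a 5-shot MMLU sub-task can be built programmatically and passed as the second argument to `load_dataset`, with `split="latest"` selecting the most recent run:

```python
def mmlu_config(task: str, shots: int = 5) -> str:
    """Build a per-task config name matching the
    "harness_hendrycksTest_<task>_<shots>" pattern used in this card."""
    return f"harness_hendrycksTest_{task}_{shots}"

print(mmlu_config("abstract_algebra"))
# harness_hendrycksTest_abstract_algebra_5

# For example, to load the most recent 5-shot abstract_algebra details
# (requires network access to the Hub):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_kyujinpy__SOLAR-Platypus-10.7B-v2",
#     mmlu_config("abstract_algebra"),
#     split="latest",
# )
```

Each such configuration also exposes a timestamped split (here `2023_12_16T16_14_50.048840`) pointing at the same run.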
## Latest results
These are the [latest results from run 2023-12-16T16:14:50.048840](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__SOLAR-Platypus-10.7B-v2/blob/main/results_2023-12-16T16-14-50.048840.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5933977113371075,
"acc_stderr": 0.033089600641254734,
"acc_norm": 0.6032526200271864,
"acc_norm_stderr": 0.033912305079181165,
"mc1": 0.2876376988984088,
"mc1_stderr": 0.0158463151013948,
"mc2": 0.4314947895428414,
"mc2_stderr": 0.014252289388190327
},
"harness|arc:challenge|25": {
"acc": 0.5443686006825939,
"acc_stderr": 0.01455374993930686,
"acc_norm": 0.5938566552901023,
"acc_norm_stderr": 0.014351656690097862
},
"harness|hellaswag|10": {
"acc": 0.6358295160326628,
"acc_stderr": 0.004802133511654238,
"acc_norm": 0.8356901015733917,
"acc_norm_stderr": 0.003697992356124479
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370331,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370331
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113729,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113729
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586794,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586794
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5641025641025641,
"acc_stderr": 0.025141801511177498,
"acc_norm": 0.5641025641025641,
"acc_norm_stderr": 0.025141801511177498
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.017437937173343233,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.017437937173343233
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437416,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437416
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516301,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516301
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7982120051085568,
"acc_stderr": 0.014351702181636864,
"acc_norm": 0.7982120051085568,
"acc_norm_stderr": 0.014351702181636864
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.02615219861972679,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.02615219861972679
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.016104833880142295,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.016104833880142295
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776165,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.025976566010862748,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.025976566010862748
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43415906127770537,
"acc_stderr": 0.01265903323706725,
"acc_norm": 0.43415906127770537,
"acc_norm_stderr": 0.01265903323706725
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5955882352941176,
"acc_stderr": 0.02981263070156974,
"acc_norm": 0.5955882352941176,
"acc_norm_stderr": 0.02981263070156974
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.619281045751634,
"acc_stderr": 0.019643801557924806,
"acc_norm": 0.619281045751634,
"acc_norm_stderr": 0.019643801557924806
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5551020408163265,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.5551020408163265,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533207,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533207
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2876376988984088,
"mc1_stderr": 0.0158463151013948,
"mc2": 0.4314947895428414,
"mc2_stderr": 0.014252289388190327
},
"harness|winogrande|5": {
"acc": 0.8145224940805051,
"acc_stderr": 0.010923965303140503
},
"harness|gsm8k|5": {
"acc": 0.0401819560272934,
"acc_stderr": 0.005409439736970511
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-eval-inverse-scaling__redefine-math-inverse-scaling__redefin-f7efd9-1695359600 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- inverse-scaling/redefine-math
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-1.3b_eval
metrics: []
dataset_name: inverse-scaling/redefine-math
dataset_config: inverse-scaling--redefine-math
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-1.3b_eval
* Dataset: inverse-scaling/redefine-math
* Config: inverse-scaling--redefine-math
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@MicPie](https://huggingface.co/MicPie) for evaluating this model. |
zahide/turkish-instructions-220k | ---
license: apache-2.0
---
This dataset is a concatenation of the following datasets:
- https://huggingface.co/datasets/merve/turkish_instructions (51.6k rows)
- https://huggingface.co/datasets/ardaorcun/turkish-instruction-dataset-prepared (66k rows)
- https://huggingface.co/datasets/CausalLM/GPT-4-Self-Instruct-Turkish (3.08k rows)
- https://huggingface.co/datasets/halilibr/collected-turkish-instructions-v0.1 (104k rows)
|
cheafdevo56/Influential_CitedNegs_10_Percent | ---
dataset_info:
features:
- name: query
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: pos
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: neg
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: score
dtype: int64
- name: title
dtype: string
splits:
- name: train
num_bytes: 173325806.1
num_examples: 45000
- name: validation
num_bytes: 19258422.9
num_examples: 5000
download_size: 115680519
dataset_size: 192584229.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
TimoImhof/SQuAD-V1-in-SQuAD-format | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: context
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: unmodified
num_bytes: 9570059
num_examples: 10552
- name: modified_30_percent
num_bytes: 9577354
num_examples: 10552
- name: modified_100_percent
num_bytes: 9594310
num_examples: 10552
download_size: 9334653
dataset_size: 28741723
---
# Dataset Card for "SQuAD-V1-in-SQuAD-format"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/giuseppe_garibaldi_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of giuseppe_garibaldi/ジュゼッペ・ガリバルディ (Kantai Collection)
This is the dataset of giuseppe_garibaldi/ジュゼッペ・ガリバルディ (Kantai Collection), containing 114 images and their tags.
The core tags of this character are `pink_hair, short_hair, breasts, pink_eyes, large_breasts, hat, white_headwear, mini_hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 114 | 113.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/giuseppe_garibaldi_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 114 | 76.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/giuseppe_garibaldi_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 272 | 162.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/giuseppe_garibaldi_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 114 | 104.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/giuseppe_garibaldi_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 272 | 207.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/giuseppe_garibaldi_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/giuseppe_garibaldi_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, solo, navel, alternate_costume, blush, cowboy_shot, simple_background, cleavage, looking_at_viewer, one-hour_drawing_challenge, white_background, bikini, collarbone, smile, twitter_username |
| 1 | 9 |  |  |  |  |  | 1girl, red_skirt, short_sleeves, armpit_cutout, red_shirt, simple_background, solo, white_gloves, black_ribbon, white_background, pleated_skirt, blush, cowboy_shot, looking_at_viewer, medium_breasts, neck_ribbon, smile |
| 2 | 8 |  |  |  |  |  | 1girl, red_footwear, red_skirt, short_sleeves, simple_background, solo, full_body, knee_boots, lace-up_boots, pleated_skirt, red_shirt, white_gloves, armpit_cutout, medium_breasts, white_background, anchor, rigging, sideboob, standing, bangs, machinery, ribbon |
| 3 | 11 |  |  |  |  |  | 1girl, fake_animal_ears, looking_at_viewer, solo, cowboy_shot, rabbit_ears, playboy_bunny, simple_background, white_background, blush, bowtie, detached_collar, wrist_cuffs, cleavage, red_leotard, strapless_leotard, twitter_username, alternate_costume, gloves, hair_between_eyes, hand_on_hip, pantyhose, smile, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | navel | alternate_costume | blush | cowboy_shot | simple_background | cleavage | looking_at_viewer | one-hour_drawing_challenge | white_background | bikini | collarbone | smile | twitter_username | red_skirt | short_sleeves | armpit_cutout | red_shirt | white_gloves | black_ribbon | pleated_skirt | medium_breasts | neck_ribbon | red_footwear | full_body | knee_boots | lace-up_boots | anchor | rigging | sideboob | standing | bangs | machinery | ribbon | fake_animal_ears | rabbit_ears | playboy_bunny | bowtie | detached_collar | wrist_cuffs | red_leotard | strapless_leotard | gloves | hair_between_eyes | hand_on_hip | pantyhose | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:--------|:--------------|:--------------------|:-----------|:--------------------|:-----------------------------|:-------------------|:---------|:-------------|:--------|:-------------------|:------------|:----------------|:----------------|:------------|:---------------|:---------------|:----------------|:-----------------|:--------------|:---------------|:------------|:-------------|:----------------|:---------|:----------|:-----------|:-----------|:--------|:------------|:---------|:-------------------|:--------------|:----------------|:---------|:------------------|:--------------|:--------------|:--------------------|:---------|:--------------------|:--------------|:------------|:-------------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | | | X | X | X | | X | | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | | | | | X | | | | X | | | | | X | X | X | X | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | | X | X | X | X | X | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
00data00/data | ---
license: afl-3.0
---
|
jahb57/gpt2_embeddings_BATCH_1 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: last_hidden_state
sequence:
sequence: float32
splits:
- name: train
num_bytes: 18790180793
num_examples: 100000
download_size: 18838646485
dataset_size: 18790180793
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LisaDuj/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nguyenthanhdo/vhac_v2_chai_format | ---
dataset_info:
features:
- name: model_input
dtype: string
- name: model_output
dtype: string
splits:
- name: train
num_bytes: 369591059.0
num_examples: 108658
download_size: 177238172
dataset_size: 369591059.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "vhac_v2_chai_format"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Zahra99/IEMOCAP_Text_another_encoding | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neu
'1': ang
'2': hap
'3': sad
splits:
- name: session1
num_bytes: 71932
num_examples: 1085
- name: session2
num_bytes: 79012
num_examples: 1023
- name: session3
num_bytes: 74980
num_examples: 1151
- name: session4
num_bytes: 72622
num_examples: 1031
- name: session5
num_bytes: 89524
num_examples: 1241
download_size: 217602
dataset_size: 388070
---
# Dataset Card for "IEMOCAP_Text_another_encoding"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KhalfounMehdi/mura_dataset_processed_224px_split | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': abnormal
'1': normal
splits:
- name: train
num_bytes: 897597368.7549056
num_examples: 36004
- name: test
num_bytes: 99746891.24509436
num_examples: 4001
download_size: 997622999
dataset_size: 997344260.0
---
# Dataset Card for "mura_dataset_processed_224px_split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_YeungNLP__firefly-bloom-7b1 | ---
pretty_name: Evaluation run of YeungNLP/firefly-bloom-7b1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YeungNLP/firefly-bloom-7b1](https://huggingface.co/YeungNLP/firefly-bloom-7b1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YeungNLP__firefly-bloom-7b1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-15T03:08:36.849842](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-bloom-7b1/blob/main/results_2023-10-15T03-08-36.849842.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03208892617449664,\n\
\ \"em_stderr\": 0.0018048244787816678,\n \"f1\": 0.1036986157718121,\n\
\ \"f1_stderr\": 0.0023306866623647965,\n \"acc\": 0.326221462409936,\n\
\ \"acc_stderr\": 0.007855425735305286\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.03208892617449664,\n \"em_stderr\": 0.0018048244787816678,\n\
\ \"f1\": 0.1036986157718121,\n \"f1_stderr\": 0.0023306866623647965\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006823351023502654,\n \
\ \"acc_stderr\": 0.0022675371022544836\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6456195737963694,\n \"acc_stderr\": 0.013443314368356088\n\
\ }\n}\n```"
repo_url: https://huggingface.co/YeungNLP/firefly-bloom-7b1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_15T03_08_36.849842
path:
- '**/details_harness|drop|3_2023-10-15T03-08-36.849842.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-15T03-08-36.849842.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_15T03_08_36.849842
path:
- '**/details_harness|gsm8k|5_2023-10-15T03-08-36.849842.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-15T03-08-36.849842.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:41:37.942439.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:41:37.942439.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-17T18:41:37.942439.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_15T03_08_36.849842
path:
- '**/details_harness|winogrande|5_2023-10-15T03-08-36.849842.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-15T03-08-36.849842.parquet'
- config_name: results
data_files:
- split: 2023_08_17T18_41_37.942439
path:
- results_2023-08-17T18:41:37.942439.parquet
- split: 2023_10_15T03_08_36.849842
path:
- results_2023-10-15T03-08-36.849842.parquet
- split: latest
path:
- results_2023-10-15T03-08-36.849842.parquet
---
# Dataset Card for Evaluation run of YeungNLP/firefly-bloom-7b1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/YeungNLP/firefly-bloom-7b1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [YeungNLP/firefly-bloom-7b1](https://huggingface.co/YeungNLP/firefly-bloom-7b1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YeungNLP__firefly-bloom-7b1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T03:08:36.849842](https://huggingface.co/datasets/open-llm-leaderboard/details_YeungNLP__firefly-bloom-7b1/blob/main/results_2023-10-15T03-08-36.849842.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of its own config):
```python
{
"all": {
"em": 0.03208892617449664,
"em_stderr": 0.0018048244787816678,
"f1": 0.1036986157718121,
"f1_stderr": 0.0023306866623647965,
"acc": 0.326221462409936,
"acc_stderr": 0.007855425735305286
},
"harness|drop|3": {
"em": 0.03208892617449664,
"em_stderr": 0.0018048244787816678,
"f1": 0.1036986157718121,
"f1_stderr": 0.0023306866623647965
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.0022675371022544836
},
"harness|winogrande|5": {
"acc": 0.6456195737963694,
"acc_stderr": 0.013443314368356088
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ovior/twitter_dataset_1713093001 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2733690
num_examples: 8029
download_size: 1556287
dataset_size: 2733690
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sunhaozhepy/ag_news_rake_keywords | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': World
'1': Sports
'2': Business
'3': Sci/Tech
- name: keywords
dtype: string
splits:
- name: train
num_bytes: 40094650
num_examples: 120000
- name: test
num_bytes: 2528496
num_examples: 7600
download_size: 26961660
dataset_size: 42623146
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
datahrvoje/twitter_dataset_1713069467 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 19390
num_examples: 42
download_size: 10079
dataset_size: 19390
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
homersimpson/opensubtitles_es | ---
dataset_info:
features:
- name: id
dtype: string
- name: meta
struct:
- name: year
dtype: uint32
- name: imdbId
dtype: uint32
- name: subtitleId
struct:
- name: ca
dtype: uint32
- name: es
dtype: uint32
- name: sentenceIds
struct:
- name: ca
sequence: uint32
- name: es
sequence: uint32
- name: translation
dtype:
translation:
languages:
- ca
- es
splits:
- name: train
num_bytes: 27943115.2
num_examples: 240000
- name: validation
num_bytes: 3492889.4
num_examples: 30000
- name: test
num_bytes: 3492889.4
num_examples: 30000
download_size: 25111166
dataset_size: 34928894.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Rami/my_section_5 | ---
dataset_info:
features:
- name: body
dtype: string
- name: question_id
dtype: string
- name: label
dtype: string
- name: meta_data
struct:
- name: AcceptedAnswerId
dtype: string
- name: CommentCount
dtype: string
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: Score
dtype: string
- name: Tags
sequence: string
- name: Title
dtype: string
- name: answer
struct:
- name: body
dtype: string
- name: comments
list:
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: Score
dtype: string
- name: body
dtype: string
- name: meta_data
struct:
- name: CommentCount
dtype: string
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: ParentId
dtype: string
- name: Score
dtype: string
splits:
- name: train
num_bytes: 557588
num_examples: 71
download_size: 236408
dataset_size: 557588
---
# Dataset Card for "my_section_5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jonatanpcn/Jp | ---
license: openrail
---
|
mayaram/ArabicImageCaptioningAdaset | ---
task_categories:
- image-to-text
language:
- ar
pretty_name: AIC-Dataset
---
# Image Captioning Dataset
## Overview
This dataset is designed for image captioning tasks and consists of a collection of images paired with corresponding captions. It aims to facilitate research and development in image captioning and can be used for training and evaluating image captioning models.
## Dataset Details
- Number of Images: 9228
- Image Source: Flickr30K
- Caption Language: Arabic
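When working with Arabic captions like these, a light normalization pass is often applied before tokenization. The sketch below is an illustration only (not part of this dataset's tooling): it strips diacritics (tashkeel) and unifies alef variants using just the Python standard library.

```python
import re
import unicodedata

def normalize_arabic(text: str) -> str:
    """Strip Arabic diacritics (tashkeel) and unify alef variants."""
    # Decompose, then drop combining marks such as fatha, sukun, hamza-above
    decomposed = unicodedata.normalize("NFKD", text)
    stripped = "".join(ch for ch in decomposed if not unicodedata.combining(ch))
    # Unify any remaining alef variants (madda/hamza forms) to the bare alef
    return re.sub("[\u0622\u0623\u0625]", "\u0627", stripped)
```

Whether to normalize this aggressively depends on the captioning model's tokenizer; some multilingual tokenizers handle diacritics natively.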
|
shikii2/chris | ---
license: openrail
---
|
zolak/twitter_dataset_1713012215 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2742098
num_examples: 6638
download_size: 1348705
dataset_size: 2742098
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Parth/mini-platypus-two-parth | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4186564
num_examples: 1000
download_size: 2245925
dataset_size: 4186564
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
deepaknautiyal/cool_new_dataset | ---
dataset_info:
features:
- name: name
dtype: string
- name: description
dtype: string
- name: ad
dtype: string
splits:
- name: train
num_bytes: 18741
num_examples: 47
download_size: 13891
dataset_size: 18741
---
# Dataset Card for "cool_new_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gabyardi/indian_food_images | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': burger
'1': butter_naan
'2': chai
'3': chapati
'4': chole_bhature
'5': dal_makhani
'6': dhokla
'7': fried_rice
'8': idli
'9': jalebi
'10': kaathi_rolls
'11': kadai_paneer
'12': kulfi
'13': masala_dosa
'14': momos
'15': paani_puri
'16': pakode
'17': pav_bhaji
'18': pizza
'19': samosa
splits:
- name: train
num_bytes: 1295507506.7994332
num_examples: 5328
- name: test
num_bytes: 230102087.3925666
num_examples: 941
download_size: 1601696484
dataset_size: 1525609594.192
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
renyulin/test_ds | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': '"'
'1': ''''''
'2': '#'
'3': $
'4': (
'5': )
'6': ','
'7': .
'8': ':'
'9': '``'
'10': CC
'11': CD
'12': DT
'13': EX
'14': FW
'15': IN
'16': JJ
'17': JJR
'18': JJS
'19': LS
'20': MD
'21': NN
'22': NNP
'23': NNPS
'24': NNS
'25': NN|SYM
'26': PDT
'27': POS
'28': PRP
'29': PRP$
'30': RB
'31': RBR
'32': RBS
'33': RP
'34': SYM
'35': TO
'36': UH
'37': VB
'38': VBD
'39': VBG
'40': VBN
'41': VBP
'42': VBZ
'43': WDT
'44': WP
'45': WP$
'46': WRB
- name: chunk_tags
sequence:
class_label:
names:
'0': O
'1': B-ADJP
'2': I-ADJP
'3': B-ADVP
'4': I-ADVP
'5': B-CONJP
'6': I-CONJP
'7': B-INTJ
'8': I-INTJ
'9': B-LST
'10': I-LST
'11': B-NP
'12': I-NP
'13': B-PP
'14': I-PP
'15': B-PRT
'16': I-PRT
'17': B-SBAR
'18': I-SBAR
'19': B-UCP
'20': I-UCP
'21': B-VP
'22': I-VP
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-MISC
'8': I-MISC
splits:
- name: train
num_bytes: 6931345
num_examples: 14041
- name: validation
num_bytes: 1739223
num_examples: 3250
- name: test
num_bytes: 1582054
num_examples: 3453
download_size: 1815184
dataset_size: 10252622
---
# Dataset Card for "test_ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceH4/testing_self_instruct_small | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 20379
num_examples: 100
- name: test
num_bytes: 26586
num_examples: 100
download_size: 35875
dataset_size: 46965
---
# Dataset Card for "testing_self_instruct_small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sngsfydy/Disease_Grading_for_DR_and_Mucula | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
splits:
- name: train
num_bytes: 261501746.0
num_examples: 413
- name: test
num_bytes: 64805638.0
num_examples: 103
download_size: 316625605
dataset_size: 326307384.0
---
# Dataset Card for "Disease_Grading_for_DR_and_Mucula"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
renumics/spotlight-cifar100-enrichment | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: prediction
dtype:
class_label:
names:
'0': apple
'1': aquarium_fish
'2': baby
'3': bear
'4': beaver
'5': bed
'6': bee
'7': beetle
'8': bicycle
'9': bottle
'10': bowl
'11': boy
'12': bridge
'13': bus
'14': butterfly
'15': camel
'16': can
'17': castle
'18': caterpillar
'19': cattle
'20': chair
'21': chimpanzee
'22': clock
'23': cloud
'24': cockroach
'25': couch
          '26': crab
'27': crocodile
'28': cup
'29': dinosaur
'30': dolphin
'31': elephant
'32': flatfish
'33': forest
'34': fox
'35': girl
'36': hamster
'37': house
'38': kangaroo
'39': keyboard
'40': lamp
'41': lawn_mower
'42': leopard
'43': lion
'44': lizard
'45': lobster
'46': man
'47': maple_tree
'48': motorcycle
'49': mountain
'50': mouse
'51': mushroom
'52': oak_tree
'53': orange
'54': orchid
'55': otter
'56': palm_tree
'57': pear
'58': pickup_truck
'59': pine_tree
'60': plain
'61': plate
'62': poppy
'63': porcupine
'64': possum
'65': rabbit
'66': raccoon
'67': ray
'68': road
'69': rocket
'70': rose
'71': sea
'72': seal
'73': shark
'74': shrew
'75': skunk
'76': skyscraper
'77': snail
'78': snake
'79': spider
'80': squirrel
'81': streetcar
'82': sunflower
'83': sweet_pepper
'84': table
'85': tank
'86': telephone
'87': television
'88': tiger
'89': tractor
'90': train
'91': trout
'92': tulip
'93': turtle
'94': wardrobe
'95': whale
'96': willow_tree
'97': wolf
'98': woman
'99': worm
- name: prediction_error
dtype: bool
- name: probability
dtype: float32
- name: entropy
dtype: float32
- name: embedding_reduced
sequence: float32
length: 2
- name: embedding
sequence: float32
length: 768
splits:
- name: train
num_bytes: 154806250
num_examples: 50000
- name: test
num_bytes: 30961250
num_examples: 10000
download_size: 223227009
dataset_size: 185767500
---
# Dataset Card for "spotlight-cifar100-enrichment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
katarinagresova/Genomic_Benchmarks_human_enhancers_ensembl | ---
dataset_info:
features:
- name: seq
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 34821392
num_examples: 123872
- name: test
num_bytes: 8668172
num_examples: 30970
download_size: 4077057
dataset_size: 43489564
---
# Dataset Card for "Genomic_Benchmarks_human_enhancers_ensembl"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TheAIchemist13/hindi_asr_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcriptions
dtype: string
splits:
- name: train
num_bytes: 24441695.0
num_examples: 80
- name: test
num_bytes: 32809156.0
num_examples: 90
download_size: 28788848
dataset_size: 57250851.0
---
# Dataset Card for "hindi_asr_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nhantruongcse/summary-vietnamese-news-token-TFtest_vit5_base | ---
dataset_info:
features:
- name: Content
dtype: string
- name: Summary
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 61956745
num_examples: 8229
download_size: 27478662
dataset_size: 61956745
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Asap7772/Flatten-Math-Shepherd_0.9_2.0_-2.0_False | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: next_prompt
dtype: string
- name: next_response
dtype: string
- name: label
dtype: string
- name: question
dtype: string
- name: step
dtype: int64
- name: trajectory
dtype: int64
- name: mask
dtype: int64
- name: reward
dtype: float64
- name: mc_values
dtype: float64
splits:
- name: train
num_bytes: 4279469183
num_examples: 2482945
- name: test
num_bytes: 491798737
num_examples: 283159
download_size: 883496918
dataset_size: 4771267920
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
freddyaboulton/my-solid-theme | ---
tags: [gradio-theme]
title: My Solid Theme
colorFrom: orange
colorTo: purple
sdk: gradio
sdk_version: 3.16.2
app_file: app.py
pinned: false
license: apache-2.0
---
# My Solid Theme
## Description
A copy of the solid theme
## Preview

## Contributions
Thanks to [@freddyaboulton](https://huggingface.co/freddyaboulton) for adding this gradio theme!
|
mrbesher/tr-paraphrase-opensubtitles2018 | ---
license: cc-by-4.0
---
|
kensho/BizBench | ---
license: apache-2.0
---
|
Nexdata/194999_Uyghur_Pronunciation_Dictionary | ---
license: cc-by-nc-nd-4.0
---
## Description
The Uyghur pronunciation dictionary collects 194,999 Uyghur words with accurate pronunciations. The dictionary can provide a pronunciation reference for sound recording personnel and support research and development of pronunciation recognition technology.
For more details, please refer to the link: https://www.nexdata.ai/dataset/47?source=Huggingface
## Specifications
### Data Size
194,999 words in total
### Content
Uyghur words and their pronunciations
### Producer
All words were collected by web crawling
## Licensing Information
Commercial License
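The card does not document the on-disk file format, so as a hedged sketch assume one `word<TAB>pronunciation` entry per line; a minimal loader that tolerates blank or malformed lines and supports multiple pronunciations per word might look like this:

```python
from collections import defaultdict

def load_lexicon(lines):
    """Build a word -> list-of-pronunciations map from tab-separated lines.

    The actual file layout of this dictionary is not documented in the card;
    a 'word<TAB>pronunciation' format is assumed here purely for illustration.
    """
    lexicon = defaultdict(list)
    for line in lines:
        line = line.strip()
        if not line or "\t" not in line:
            continue  # skip blank or malformed entries
        word, pron = line.split("\t", 1)
        lexicon[word].append(pron)
    return dict(lexicon)

# Hypothetical entries, not taken from the actual dataset:
sample = ["salam\ts a l a m", "", "yaxshi\tj a x sh i"]
lexicon = load_lexicon(sample)
```

Such a map is the usual input shape for grapheme-to-phoneme checks or for building an ASR pronunciation lexicon.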
|
distilled-from-one-sec-cv12/chunk_250 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 883345500
num_examples: 172125
download_size: 903122289
dataset_size: 883345500
---
# Dataset Card for "chunk_250"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marcus2000/saiga_pravo_dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 8597758.781102799
num_examples: 2887
- name: test
num_bytes: 2150184.2188972016
num_examples: 722
download_size: 4318154
dataset_size: 10747943.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_Stopwolf__DistilabelCerberus-7B-slerp | ---
pretty_name: Evaluation run of Stopwolf/DistilabelCerberus-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Stopwolf/DistilabelCerberus-7B-slerp](https://huggingface.co/Stopwolf/DistilabelCerberus-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
  \ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Stopwolf__DistilabelCerberus-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-02-02T01:28:41.378025](https://huggingface.co/datasets/open-llm-leaderboard/details_Stopwolf__DistilabelCerberus-7B-slerp/blob/main/results_2024-02-02T01-28-41.378025.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6464333239348551,\n\
\ \"acc_stderr\": 0.032147073947899604,\n \"acc_norm\": 0.6464670335536932,\n\
\ \"acc_norm_stderr\": 0.03280457322475929,\n \"mc1\": 0.4455324357405141,\n\
\ \"mc1_stderr\": 0.017399335280140354,\n \"mc2\": 0.609312831167026,\n\
\ \"mc2_stderr\": 0.015494903078684579\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6544368600682594,\n \"acc_stderr\": 0.013896938461145687,\n\
\ \"acc_norm\": 0.681740614334471,\n \"acc_norm_stderr\": 0.013611993916971453\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6928898625771759,\n\
\ \"acc_stderr\": 0.004603527017557838,\n \"acc_norm\": 0.867755427205736,\n\
\ \"acc_norm_stderr\": 0.003380641470989925\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.023025899617188723,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.023025899617188723\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590167,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590167\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.02675640153807897,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02675640153807897\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934724,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934724\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\
\ \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n\
\ \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n\
\ \"acc_stderr\": 0.01627792703963819,\n \"acc_norm\": 0.3854748603351955,\n\
\ \"acc_norm_stderr\": 0.01627792703963819\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729477,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729477\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959603,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959603\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.012752858346533131,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.012752858346533131\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4455324357405141,\n\
\ \"mc1_stderr\": 0.017399335280140354,\n \"mc2\": 0.609312831167026,\n\
\ \"mc2_stderr\": 0.015494903078684579\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.011350315707462057\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6982562547384382,\n \
\ \"acc_stderr\": 0.012643544762873354\n }\n}\n```"
repo_url: https://huggingface.co/Stopwolf/DistilabelCerberus-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|arc:challenge|25_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|gsm8k|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hellaswag|10_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T01-28-41.378025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T01-28-41.378025.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- '**/details_harness|winogrande|5_2024-02-02T01-28-41.378025.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T01-28-41.378025.parquet'
- config_name: results
data_files:
- split: 2024_02_02T01_28_41.378025
path:
- results_2024-02-02T01-28-41.378025.parquet
- split: latest
path:
- results_2024-02-02T01-28-41.378025.parquet
---
# Dataset Card for Evaluation run of Stopwolf/DistilabelCerberus-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Stopwolf/DistilabelCerberus-7B-slerp](https://huggingface.co/Stopwolf/DistilabelCerberus-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Stopwolf__DistilabelCerberus-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T01:28:41.378025](https://huggingface.co/datasets/open-llm-leaderboard/details_Stopwolf__DistilabelCerberus-7B-slerp/blob/main/results_2024-02-02T01-28-41.378025.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6464333239348551,
"acc_stderr": 0.032147073947899604,
"acc_norm": 0.6464670335536932,
"acc_norm_stderr": 0.03280457322475929,
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140354,
"mc2": 0.609312831167026,
"mc2_stderr": 0.015494903078684579
},
"harness|arc:challenge|25": {
"acc": 0.6544368600682594,
"acc_stderr": 0.013896938461145687,
"acc_norm": 0.681740614334471,
"acc_norm_stderr": 0.013611993916971453
},
"harness|hellaswag|10": {
"acc": 0.6928898625771759,
"acc_stderr": 0.004603527017557838,
"acc_norm": 0.867755427205736,
"acc_norm_stderr": 0.003380641470989925
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188723,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188723
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590167,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02675640153807897,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02675640153807897
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066307,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3854748603351955,
"acc_stderr": 0.01627792703963819,
"acc_norm": 0.3854748603351955,
"acc_norm_stderr": 0.01627792703963819
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729477,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729477
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959603,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959603
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533131,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533131
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140354,
"mc2": 0.609312831167026,
"mc2_stderr": 0.015494903078684579
},
"harness|winogrande|5": {
"acc": 0.7947908445146015,
"acc_stderr": 0.011350315707462057
},
"harness|gsm8k|5": {
"acc": 0.6982562547384382,
"acc_stderr": 0.012643544762873354
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/elimine_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Elimine (Fire Emblem)
This is the dataset of Elimine (Fire Emblem), containing 17 images and their tags.
The core tags of this character are `blonde_hair, breasts, long_hair, green_eyes, very_long_hair, bangs, medium_breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 26.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elimine_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 13.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elimine_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 38 | 26.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elimine_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 22.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elimine_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 38 | 39.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/elimine_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/elimine_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some character outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, solo, smile, cape, white_dress, looking_at_viewer, elbow_gloves, holding, white_gloves, closed_mouth, simple_background, armlet, bracelet, long_dress, staff, full_body, open_mouth, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | cape | white_dress | looking_at_viewer | elbow_gloves | holding | white_gloves | closed_mouth | simple_background | armlet | bracelet | long_dress | staff | full_body | open_mouth | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-------|:--------------|:--------------------|:---------------|:----------|:---------------|:---------------|:--------------------|:---------|:-----------|:-------------|:--------|:------------|:-------------|:-------------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Vipitis/Shadertoys | ---
annotations_creators:
- no-annotation
language:
- en
- code
language_creators:
- machine-generated
license:
- cc-by-nc-sa-3.0
multilinguality: []
pretty_name: Shadertoys
size_categories:
- 10K<n<100K
source_datasets: []
tags:
- code
task_categories:
- text-generation
- text-to-image
task_ids: []
dataset_info:
features:
- name: num_passes
dtype: int64
- name: has_inputs
dtype: bool
- name: name
dtype: string
- name: type
dtype: string
- name: code
dtype: string
- name: title
dtype: string
- name: description
dtype: string
- name: tags
sequence: string
- name: author
dtype: string
- name: license
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 162960894
num_examples: 37841
- name: test
num_bytes: 26450429
num_examples: 6617
download_size: 86294414
dataset_size: 189411323
---
# Dataset Card for Shadertoys
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Source Data](#source-data)
- [Licensing Information](#licensing-information)
## Dataset Description
- **Repository:** https://github.com/Vipitis/shadertoys-dataset
### Dataset Summary
The Shadertoys dataset contains over 44k renderpasses collected from the Shadertoy.com API. Some shader programs contain multiple render passes.
To browse a subset of this dataset, look at the [ShaderEval](https://huggingface.co/spaces/Vipitis/ShaderCoder) space. A finer-grained variant of this dataset is [Shadertoys-fine](https://huggingface.co/datasets/Vipitis/Shadertoys-fine).
### Supported Tasks and Leaderboards
`text-generation`: the dataset can be used to train generative language models for code-completion tasks.
`ShaderEval`: [task 1](https://huggingface.co/spaces/Vipitis/ShaderEval) of ShaderEval uses a dataset derived from Shadertoys to test return completion by autoregressive language models.
### Languages
- English (title, description, tags, comments)
- Shadercode **programming** language, a subset of GLSL specifically for Shadertoy.com
## Dataset Structure
### Data Instances
A data point consists of the whole shader code and some information from the API, as well as additional metadata.
```
{
'num_passes': 1,
'has_inputs': False,
'name': 'Image',
'type': 'image',
'code': '<full code>',
'title': '<title of the shader>',
'description': '<description of the shader>',
'tags': ['tag1','tag2','tag3', ... ],
'license': 'unknown',
'author': '<username>',
'source': 'https://shadertoy.com/view/<shaderID>'
}
```
### Data Fields
- 'num_passes' number of passes the parent shader program has
- 'has_inputs' whether any inputs were used, such as textures or audio streams
- 'name' name of the renderpass, usually Image, Buffer A, Common, etc.
- 'type' type of the renderpass; one of `{'buffer', 'common', 'cubemap', 'image', 'sound'}`
- 'code' the raw code (including comments) of the whole renderpass
- 'title' name of the Shader
- 'description' description given for the Shader
- 'tags' list of tags assigned to the Shader (by its creator); there are more than 10000 unique tags
- 'license' currently in development
- 'author' username of the shader author
- 'source' URL to the shader, not to the specific renderpass
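As a sketch of how these fields might be used (the toy records below are invented to mimic the documented schema, not taken from the dataset), one could filter renderpasses by `type`:

```python
# Toy records mimicking the documented schema; real data would come from
# load_dataset("Vipitis/Shadertoys") instead.
renderpasses = [
    {"name": "Image", "type": "image", "num_passes": 1, "has_inputs": False},
    {"name": "Buffer A", "type": "buffer", "num_passes": 2, "has_inputs": True},
    {"name": "Sound", "type": "sound", "num_passes": 1, "has_inputs": False},
]

# Keep only the final image passes, e.g. to restrict training to single-output shaders.
image_passes = [rp for rp in renderpasses if rp["type"] == "image"]
print([rp["name"] for rp in image_passes])  # ['Image']
```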
### Data Splits
Currently available (shuffled):
- train (85.0%)
- test (15.0%)
## Dataset Creation
Data retrieved starting 2022-07-20
### Source Data
#### Initial Data Collection and Normalization
All data was collected via the [Shadertoy.com API](https://www.shadertoy.com/howto#q2) and then iterated over the items in 'renderpass' while adding some of the fields from 'info'.
The code to generate these datasets should be published on the GitHub repository in the near future.
#### Who are the source language producers?
Shadertoy.com contributors who publish their shaders as 'public+API'
## Licensing Information
The Default [license for each Shader](https://www.shadertoy.com/terms) is CC BY-NC-SA 3.0. However, some Shaders might have a different license attached.
The dataset currently does not filter for any licenses, but provides a license tag when one is easily recognizable by naive means.
Please check the first comment of each shader program yourself so as not to violate any copyrights in downstream use. The main license requires share-alike and attribution.
Attribution for every entry can be found in the 'author' column, but it might not cover further attribution within the code itself or parents of forked shaders.
renumics/spotlight-beans-enrichment | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image_file_path.embedding
sequence: float32
length: 2
- name: image.embedding
sequence: float32
length: 2
splits:
- name: train
num_bytes: 16544
num_examples: 1034
- name: validation
num_bytes: 2128
num_examples: 133
- name: test
num_bytes: 2048
num_examples: 128
download_size: 33961
dataset_size: 20720
---
# Dataset Card for "spotlight-beans-enrichment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
manycore-research/PlankAssembly | ---
license: cc-by-nc-nd-4.0
size_categories:
- 10K<n<100K
---
# PlankAssembly Dataset
If you encounter downloading issues, you can directly download the dataset [here](https://manycore-research-azure.kujiale.com/manycore-research/PlankAssembly/data.zip).
## Dataset Description
- **Homepage:** https://manycore-research.github.io/PlankAssembly
- **Repository:** https://github.com/manycore-research/PlankAssembly
- **Paper:** https://arxiv.org/abs/2308.05744
### Dataset Summary
This is the dataset used for training [PlankAssembly](https://manycore-research.github.io/PlankAssembly). It contains 26,707 shape programs derived from parametric CAD models.
## Dataset Structure
The PlankAssembly dataset is a directory with the following structure:

```
PlankAssemblyDataset
├── model                # shape programs
│   └── <MODEL_ID>.json
└── splits               # dataset splits
    ├── train.txt
    ├── valid.txt
    └── test.txt
```
## PlankAssembly DSL
A cabinet is typically assembled by a list of plank models, where each plank is represented as an axis-aligned cuboid. A cuboid has six degrees of freedom, which correspond to the starting and ending coordinates along the three axes:
```
Cuboid (x_min, y_min, z_min, x_max, y_max, z_max).
```
Each coordinate can either take a numerical value or be a pointer to the corresponding coordinate of another cuboid (to which it attaches).
In the parametric modeling software, a plank is typically created by first drawing a 2D profile and then applying the extrusion command. Thus, we categorize the faces of each plank into *sideface* or *endface*, depending on whether they are along the direction of the extrusion or not. Then, given a pair of faces from two different planks, we consider that an attachment relationship exists if (i) the two faces are within a distance threshold of 1mm and (ii) the pair consists of one sideface and one endface.
## Shape Program
Each shape program (*model.json*) is a JSON file with the following structure:
```python
{
# model id
"name": str,
# numerical values of all planks, the units are millimeters
"planks": List[List], # N x 6
# extrusion direction of each plank
"normal": List[List], # N x 3
# attachment relationships
# -1 denotes no attachment relationship
    # other values denote an index into the flattened plank sequence
"attach": List[List], # N x 6
}
```
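The pointer semantics of the `attach` field can be sketched in a few lines. Note that this is our reading of the format and not an official reference implementation: we assume an attachment index addresses the flattened N x 6 coordinate array, so `index // 6` selects the plank and `index % 6` the coordinate.

```python
def resolve_coordinates(planks, attach):
    """Replace attachment pointers with the numerical values they point to.

    planks: N x 6 list of coordinates (millimeters).
    attach: N x 6 list; -1 means no attachment, otherwise an index into the
            flattened plank sequence (assumed layout: plank-major order).
    """
    resolved = []
    for plank, pointers in zip(planks, attach):
        row = []
        for value, ptr in zip(plank, pointers):
            if ptr == -1:
                row.append(value)          # free coordinate, keep its value
            else:
                row.append(planks[ptr // 6][ptr % 6])  # follow the pointer
        resolved.append(row)
    return resolved

# Two planks: the second plank's x_min attaches to the first plank's x_max
# (flat index 3 in the assumed plank-major layout).
planks = [[0, 0, 0, 600, 18, 400], [600, 0, 0, 618, 18, 400]]
attach = [[-1] * 6, [3, -1, -1, -1, -1, -1]]
print(resolve_coordinates(planks, attach))
```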
## BibTex
Please cite our paper if you use PlankAssembly dataset in your work:
```bibtex
@inproceedings{PlankAssembly,
author = {Hu, Wentao and Zheng, Jia and Zhang, Zixin and Yuan, Xiaojun and Yin, Jian and Zhou, Zihan},
title = {PlankAssembly: Robust 3D Reconstruction from Three Orthographic Views with Learnt Shape Programs},
booktitle = {ICCV},
year = {2023}
}
``` |
gonglinyuan/mbpp_with_prompt | ---
license: cc-by-4.0
---
|
tyzhu/lmind_hotpot_train500_eval300_v1_qa | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
splits:
- name: train_qa
num_bytes: 84812
num_examples: 500
- name: train_recite_qa
num_bytes: 525773
num_examples: 500
- name: eval_qa
num_bytes: 49916
num_examples: 300
- name: eval_recite_qa
num_bytes: 324839
num_examples: 300
- name: all_docs
num_bytes: 738612
num_examples: 1594
- name: all_docs_eval
num_bytes: 738503
num_examples: 1594
- name: train
num_bytes: 84812
num_examples: 500
- name: validation
num_bytes: 49916
num_examples: 300
download_size: 1623187
dataset_size: 2597183
---
# Dataset Card for "lmind_hotpot_train500_eval300_v1_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ybubnou/pubg | ---
license: odbl
---
|
Softage-AI/AI-tool-agents_dataset | ---
task_categories:
- feature-extraction
language:
- en
tags:
- data_annotation
- data
- AI
- training
- audio and video data annotation
size_categories:
- n<1K
license: mit
---
# Annotation Techniques Sample Database
## Description
Explore this dataset containing 54 queries from 54 different tools/software. It serves as a versatile resource for building tool-specific assistant LLMs, information retrieval, and model training.
## Data attributes
- Tools: List of 54 different software or tools
- Audio Prompt: An auditory cue corresponding to the actions described in the text responses
- Text Prompt: Written instructions guiding or prompting particular activities or tasks
- Video File: A digital file containing visual information, likely used for presenting video content within the dataset
- Action File: A recording of the key presses, mouse clicks, and mouse movements captured by the action recorder. These logs are essential for understanding the sequence and frequency of user inputs, and they explain the prompt actions.
- Output File: The result generated from specific operations or processing within the dataset
- OS: The operating system environment under which the associated tool data was generated, either macOS or Windows
- Bit rate: The rate at which bits are processed or transmitted, often referring to audio or video data compression
- Frequency: The number of occurrences of a repeating event per unit of time, associated with audio signals in this dataset, measured in hertz (Hz)
## Limitations & Biases
- Bias may arise from selecting tools based on popularity and industry relevance, potentially favoring widely used tools and their associated use cases.
- The dataset may include the most common approach for performing actions in a tool, potentially overlooking alternative methods.
- The dataset does not encompass actions related to ordering, payments, or card transactions in tools. This limitation arises from the avoidance of sensitive transactions, requiring subject matter expertise or team member involvement.
- Certain tools in the dataset were recorded using trial versions, while premium versions were available for others. Consequently, this introduces limitations in the functionality of some tools.
## Potential use cases
Train LLMs on user interactions (key presses, mouse movements, outputs) within various software, allowing a comprehensive evaluation of their ability to understand user behavior patterns and mimic user actions.
## Data Source
This dataset is created by the delivery team @SoftAge |
Borrri/Borri4 | ---
license: cc
---
|
ontocord/OIG-moderation | ---
license: apache-2.0
---
# This is the Open Instruction Generalist - Moderation Dataset
This is our attempt to create a diverse dataset of user dialogue that may be related to NSFW subject matter, abuse-eliciting text, privacy-violation-eliciting instructions, depression or related content, hate speech, and other similar topics. We use the [prosocial] and [anthropic redteam] datasets and subsets of [English wikipedia], along with other public datasets described below and data created or contributed by volunteers. To regularize the dataset we also include "regular" OIG instructions, such as Q/A instructions, coding instructions, and similar types of queries. We include only the user prompts, not a potential reply by a bot. Currently there are two versions of the dataset:
- OIG_safety_v0.1.jsonl (66200)
- OIG_safety_v0.2.jsonl (134530)
OIG-moderation includes data from:
* The train splits of public datasets such as anthropic-redteam, anthropic-harmless, and prosocial, plus contributed datasets from community members
* Augmented toxic data, such as civil comments data converted into instructions, and the train set of anthropic-redteam augmented with prosocial tags
* Data provided by the LAION community that might include NSFW prompts
* Synthetic depression data generated from a public depression bag of words dataset https://huggingface.co/datasets/joangaes/depression using https://huggingface.co/pszemraj/flan-t5-large-grammar-synthesis.
* A model trained on the OIG-moderation dataset can be used to provide moderation labels, and the bot providers can choose to then block responses from their chatbots based on these labels. If a bot provider's policy for example permits sexual content, but prohibits PII eliciting text, they can hopefully do so with the output of a model trained on this data.
* The tags consist of (a) Base prosocial tags: casual, possibly needs caution, probably needs caution, needs caution, needs intervention and (b) Additional tags: abuse related, personal information related, sexual content, hate.
* An utterance can have more than one tag. For example, a wikipedia article about pornography content might be tagged: needs caution | sexual content.
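As a small illustrative sketch (we assume from the example above that multiple tags are separated by `|`; verify this against the actual jsonl files before relying on it), splitting a tag string into individual labels might look like:

```python
def parse_tags(tag_string):
    """Split a moderation tag string such as 'needs caution | sexual content'
    into a list of individual labels, dropping surrounding whitespace."""
    return [tag.strip() for tag in tag_string.split("|") if tag.strip()]

example = "needs caution | sexual content"
print(parse_tags(example))  # ['needs caution', 'sexual content']
```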
## Models & How To Use
[Build custom chatbot applications using OpenChatkit models on Amazon SageMaker](https://aws.amazon.com/blogs/machine-learning/build-custom-chatbot-applications-using-openchatkit-models-on-amazon-sagemaker/)
> OpenChatKit has a 6-billion-parameter moderation model, [GPT-JT-Moderation-6B](https://huggingface.co/togethercomputer/GPT-JT-Moderation-6B), which can moderate the chatbot to limit the inputs to the moderated subjects. Although the model itself does have some moderation built in, TogetherComputer trained a GPT-JT-Moderation-6B model with Ontocord.ai’s OIG-moderation dataset. This model runs alongside the main chatbot to check that both the user input and answer from the bot don’t contain inappropriate results. You can also use this to detect any out of domain questions to the chatbot and override when the question is not part of the chatbot’s domain.
## Acknowledgement
* We would like to thank all the following people for their amazing contributions: @Rallio, @Summer, @Iamiakk @Jue, @yp_yurilee, @Jjmachan, @Coco.han, @Pszemraj, and many others.
* We would like to thank Together.xyz for testing the v0.1 data for effectiveness and their dedication to the open source community.
* We would like to thank AI Horde and user @Db0 for their incredible contribution of filtered data that were flagged as unethical.
## Disclaimer
* These datasets contain synthetic data and, in some cases, data that includes NSFW subject matter and triggering text such as toxic/offensive/trolling content. If you are concerned about the presence of this type of material in the dataset, please make sure you carefully inspect each of the entries and filter appropriately. Our goal is for the model to be as helpful and non-toxic as possible, and we are actively evaluating ways to help create models that can detect potentially unwanted or problematic instructions or content.
## Risk Factors
* While we acknowledge that this dataset can be modified to train a model to generate unsafe text, it is important to release this publicly as a resource for both researchers and those building production agents to train detection models.
## BY ACCESSING THIS DATASET YOU AGREE YOU ARE 18 YEARS OLD OR OLDER AND UNDERSTAND THE RISKS OF USING THIS DATASET. |
ibranze/araproje_hellaswag_en_s5 | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 149738.0
num_examples: 250
download_size: 82789
dataset_size: 149738.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_en_s5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thisisHJLee/1cycle_data_2757 | ---
license: apache-2.0
---
|
senhorsapo/enel | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_cola_invariant_tag_amnt | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: train
num_bytes: 329
num_examples: 5
download_size: 2142
dataset_size: 329
---
# Dataset Card for "MULTI_VALUE_cola_invariant_tag_amnt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cloudythe/lauanvitor | ---
license: openrail
---
|
allegro/klej-dyk | ---
annotations_creators:
- expert-generated
language_creators:
- other
language:
- pl
license:
- cc-by-sa-3.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- open-domain-qa
pretty_name: Did you know?
---
# klej-dyk
## Description
The Czy wiesz? (eng. Did you know?) dataset consists of almost 5k question-answer pairs obtained from the Czy wiesz... section of Polish Wikipedia. Each question was written by a Wikipedia collaborator and is answered with a link to a relevant Wikipedia article. In the Hugging Face version of this dataset, the negatives with the largest token overlap with the question were chosen.
## Tasks (input, output, and metrics)
The task is to predict if the answer to the given question is correct or not.
**Input** ('question sentence', 'answer' columns): question and answer sentences
**Output** ('target' column): 1 if the answer is correct, 0 otherwise.
**Domain**: Wikipedia
**Measurements**: F1-Score
**Example**:
Input: `Czym zajmowali się świątnicy?` ; `Świątnik – osoba, która dawniej zajmowała się
obsługą kościoła (świątyni).`
Input (translated by DeepL): `What did the sacristans do?` ; `A sacristan - a person who used to be in charge of handling the church (temple).`
Output: `1` (the answer is correct)
## Data splits
| Subset | Cardinality |
| ----------- | ----------: |
| train | 4154 |
| val | 0 |
| test | 1029 |
## Class distribution
| Class | train | validation | test |
|:----------|--------:|-------------:|-------:|
| incorrect | 0.831 | - | 0.831 |
| correct | 0.169 | - | 0.169 |
## Citation
```
@misc{11321/39,
title = {Pytania i odpowiedzi z serwisu wikipedyjnego "Czy wiesz", wersja 1.1},
author = {Marci{\'n}czuk, Micha{\l} and Piasecki, Dominik and Piasecki, Maciej and Radziszewski, Adam},
url = {http://hdl.handle.net/11321/39},
note = {{CLARIN}-{PL} digital repository},
year = {2013}
}
```
## License
```
Creative Commons Attribution ShareAlike 3.0 licence (CC-BY-SA 3.0)
```
## Links
[HuggingFace](https://huggingface.co/datasets/dyk)
[Source](http://nlp.pwr.wroc.pl/en/tools-and-resources/resources/czy-wiesz-question-answering-dataset)
[Source #2](https://clarin-pl.eu/dspace/handle/11321/39)
[Paper](https://www.researchgate.net/publication/272685895_Open_dataset_for_development_of_Polish_Question_Answering_systems)
## Examples
### Loading
```python
from pprint import pprint
from datasets import load_dataset
dataset = load_dataset("allegro/klej-dyk")
pprint(dataset['train'][100])
#{'answer': '"W wyborach prezydenckich w 2004 roku, Moroz przekazał swoje '
# 'poparcie Wiktorowi Juszczence. Po wyborach w 2006 socjaliści '
# 'początkowo tworzyli ""pomarańczową koalicję"" z Naszą Ukrainą i '
# 'Blokiem Julii Tymoszenko."',
# 'q_id': 'czywiesz4362',
# 'question': 'ile partii tworzy powołaną przez Wiktora Juszczenkę koalicję '
# 'Blok Nasza Ukraina?',
# 'target': 0}
```
### Evaluation
```python
import random
from pprint import pprint
from datasets import load_dataset, load_metric
dataset = load_dataset("allegro/klej-dyk")
dataset = dataset.class_encode_column("target")
references = dataset["test"]["target"]
# generate random predictions
predictions = [random.randrange(max(references) + 1) for _ in range(len(references))]
acc = load_metric("accuracy")
f1 = load_metric("f1")
acc_score = acc.compute(predictions=predictions, references=references)
f1_score = f1.compute(predictions=predictions, references=references, average="macro")
pprint(acc_score)
pprint(f1_score)
# {'accuracy': 0.5286686103012633}
# {'f1': 0.46700507614213194}
``` |
LeonardoTiger/caustic | ---
license: openrail
---
|
xzxy2023412/diyici | ---
license: openrail
---
|
theblackcat102/wmt19-conversations | ---
license: unknown
---
|
HDanh/real_gen_dateset | ---
license: apache-2.0
task_categories:
- image-classification
language:
- en
pretty_name: ReFa
size_categories:
- 1K<n<10K
--- |
brando/Coq-Gym-Data-Set | ---
license: apache-2.0
dataset_info:
features:
- name: relevant_lemmas
sequence: string
- name: prev_tactics
sequence: string
- name: context
struct:
- name: bg_goals
list:
- name: goal
dtype: string
- name: hypotheses
sequence: string
- name: fg_goals
list:
- name: goal
dtype: string
- name: hypotheses
sequence: string
- name: given_up_goals
list:
- name: goal
dtype: string
- name: hypotheses
sequence: string
- name: shelved_goals
list:
- name: goal
dtype: string
- name: hypotheses
sequence: string
- name: tactic
dtype: string
splits:
- name: test
num_bytes: 4006839384
num_examples: 363042
download_size: 27586028
dataset_size: 4006839384
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
## Proverbot Scrapes
Here we include a dump of proofs from the CoqGym benchmark, scraped using the proverbot9001 tool. |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_142 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1070297664.0
num_examples: 210192
download_size: 1092354390
dataset_size: 1070297664.0
---
# Dataset Card for "chunk_142"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
peiranli0930/L-SVD | ---
license: bsd-3-clause
task_categories:
- video-classification
language:
- en
pretty_name: 'Large-Scale Selfie Video Dataset (L-SVD): A Benchmark for Emotion Recognition'
size_categories:
- 10K<n<100K
---
# Large-Scale Selfie Video Dataset (L-SVD): A Benchmark for Emotion Recognition [HomePage](https://github.com/PeiranLi0930/L-SVD)
## We are releasing the dataset in batches
## Validated Batch 1: [Link](https://drive.google.com/drive/folders/1alXjtSisiDHY3akoReIU6V2AzbvW0rau?usp=sharing)<br/>The second batch will be ready before Feb 25th.
## Note: Please specify your Contact Info and Affiliation(s) when requesting access
## Welcome to L-SVD
L-SVD is a comprehensive and meticulously curated video dataset designed to revolutionize the field of emotion recognition. With over 20,000 short video clips, each precisely annotated to reflect a wide range of human emotions, L-SVD serves as a pivotal resource at the confluence of Cognitive Science, Psychology, Computer Science, and Medical Science. Our dataset is crafted to advance research and applications within these dynamic fields, offering an unparalleled tool for innovation and discovery.
### Why L-SVD?
Drawing inspiration from the transformative ImageNet, L-SVD aims to establish itself as a cornerstone in the domain of emotional AI. We provide the global research community with a dataset characterized by its detailed labeling and uniform processing standards, ensuring high-quality video data for cutting-edge research and development.
#### Key Features
- **Rich Emotional Annotations**: L-SVD encompasses a spectrum of eight emotions—Anger, Contempt, Disgust, Enjoyment, Fear, Sadness, Surprise, and Neutral. Each emotion is annotated with unparalleled precision, providing a robust foundation for emotion recognition algorithms.
- **Uniform Video Quality**: To facilitate algorithm development and testing, all videos within L-SVD maintain consistent hue, contrast, and brightness, ensuring a standardized quality baseline across the dataset.
- **Community-Driven Expansion**: L-SVD is in a state of continuous growth, with contributions from the global community enriching the dataset's diversity and depth.
### Dataset Features
- **Comprehensive Emotional Spectrum**: Our dataset offers a wide-ranging exploration of human emotions, each meticulously labeled to support precise recognition and analysis.
- **Optimized for Research Excellence**: Through careful pre-processing, L-SVD sets a benchmark for training data quality, offering high fidelity and uniformity across all clips.
- **Global Participation**: We warmly invite researchers and practitioners worldwide to contribute to L-SVD, fostering a diverse and expansive dataset.
## How to Contribute
Your contributions are essential to the growth and success of L-SVD. To contribute, please follow the instructions to upload your data [HERE](https://drive.google.com/drive/folders/1s-Ar6O2g-IYYXheRkO01FHiuGykjSeX6?usp=sharing). We will review and validate the labels within a few days of submission.
Join us in advancing the fields of Machine Learning and Deep Learning! After submitting your data, please email [ME](mailto:pli258@wisc.edu) with the details of your submission, including filepaths, modalities, affiliations, and GitHub Username. We look forward to acknowledging your valuable contributions on our homepage.
## Getting Started
Our dataset, L-SVD, is shared via Google Drive, enabling easy access and collaboration. The dataset is released in batches, ensuring ongoing updates and expansions.
To access L-SVD, please visit [U-SVD](https://drive.google.com/drive/folders/1alXjtSisiDHY3akoReIU6V2AzbvW0rau?usp=sharing) and submit a request including your Contact Information and Affiliations. This process ensures a collaborative and secure environment for all users.
Thank you for your interest in L-SVD. Together, we can push the boundaries of emotion recognition research and development.
### Usage Example
```python
# Example code to load the L-SVD dataset
import emotionnet
# Load dataset
dataset = emotionnet.load('/path/to/emotionnet')
# Loop through the dataset
for video in dataset:
frames, emotions = video['frames'], video['emotions']
# Insert your model training or evaluation code here
```
### Citation
If you use L-SVD in your academic or industry research, please cite it as follows:
```bibtex
@misc{emotionnet2023,
  title={L-SVD: A Comprehensive Video Dataset for Emotion Recognition},
  author={Peiran L. and Linbo T. and Xizheng Y.},
  note={University of Wisconsin--Madison},
  year={2024},
  howpublished={\url{https://github.com/PeiranLi0930/emotionnet}},
}
```
### License
L-SVD is released under the [BSD-3-Clause license](LICENSE).
### Contact
For support or further inquiries, please contact us at [pli258@wisc.edu](mailto:pli258@wisc.edu).
### Acknowledgments
We acknowledge the collective efforts of all contributors from the University of Wisconsin Madison's Computer Science Department and the global research community. Your insights and contributions are shaping the future of emotion recognition technology. |
davidberenstein1957/distilabel-archangel-children-dpo | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 1147439
num_examples: 1000
download_size: 663008
dataset_size: 1147439
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SDbiaseval/identities | ---
dataset_info:
features:
- name: ethnicity
dtype: string
- name: gender
dtype: string
- name: 'no'
dtype: int32
- name: image_path
dtype: string
- name: image
dtype: image
- name: model
dtype: string
splits:
- name: train
num_bytes: 585336673.0
num_examples: 2040
download_size: 465986042
dataset_size: 585336673.0
---
# Dataset Card for "identities"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yleo/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966694
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pravsels/Manim-Tutorials-2021_brianamedee_code | ---
dataset_info:
features:
- name: file_path
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 77899
num_examples: 9
download_size: 25161
dataset_size: 77899
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/cnoc_na_riabh_yaraan_doo_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of cnoc_na_riabh_yaraan_doo/ノクナレア・ヤラアーンドゥ/诺克娜蕾·雅兰杜 (Fate/Grand Order)
This is the dataset of cnoc_na_riabh_yaraan_doo/ノクナレア・ヤラアーンドゥ/诺克娜蕾·雅兰杜 (Fate/Grand Order), containing 41 images and their tags.
The core tags of this character are `long_hair, pink_hair, yellow_eyes, breasts, medium_breasts`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 41 | 55.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cnoc_na_riabh_yaraan_doo_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 41 | 47.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cnoc_na_riabh_yaraan_doo_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 80 | 82.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cnoc_na_riabh_yaraan_doo_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cnoc_na_riabh_yaraan_doo_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------|
| 0 | 41 |  |  |  |  |  | 1girl, looking_at_viewer, smile, tiara, solo, black_bikini |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | tiara | solo | black_bikini |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:--------|:-------|:---------------|
| 0 | 41 |  |  |  |  |  | X | X | X | X | X | X |
|
abinthomasonline/stained-glass | ---
license: mit
task_categories:
- text-to-image
- image-to-image
- unconditional-image-generation
tags:
- art
pretty_name: stained
size_categories:
- n<1K
---
# Stained Glass Art Dataset for Diffusion Models
## Overview
This dataset consists of 21 high-resolution images of stained glass art, accompanied by corresponding captions. It is designed for fine-tuning diffusion models using techniques such as textual inversion and dreambooth. The dataset is intended to facilitate research and experimentation in generating stained glass art-inspired images.
## Dataset Structure
- **Images:** The stained glass art images are stored in the "images" directory, with filenames ranging from "0.jpg" to "20.jpg."
- **Captions:** Captions for each image are provided in the "captions.csv" file located in the dataset's root directory. The captions contain placeholders for adjectives and a custom token to represent stained glass art. For example: "A {adjective} {token} of a puppy."
- **Adjective Placeholders:** During training, the {adjective} placeholder in the captions can be randomly selected from the following list: `["", "good", "cropped", "clean", "bright", "cool", "nice", "small", "large", "dark", "weird"]`.
- **Token Placeholder:** The {token} placeholder represents the custom token that needs to be trained to capture the unique art style of stained glass. This token is a key element in generating realistic stained glass art-inspired images.
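As a rough illustration, the caption expansion described above might be sketched as follows. This is an assumed sketch, not code from the dataset authors; in particular, the token name `<stained-glass>` and the `fill_caption` helper are hypothetical.

```python
import random

# Assumed sketch: expand a caption template from captions.csv using the
# adjective list quoted above and a placeholder for the learned style token.
ADJECTIVES = ["", "good", "cropped", "clean", "bright", "cool",
              "nice", "small", "large", "dark", "weird"]
TOKEN = "<stained-glass>"  # hypothetical name for the custom token

def fill_caption(template: str, rng: random.Random) -> str:
    caption = template.format(adjective=rng.choice(ADJECTIVES), token=TOKEN)
    # Collapse the double space left behind when the empty adjective is drawn.
    return " ".join(caption.split())

print(fill_caption("A {adjective} {token} of a puppy.", random.Random(0)))
```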
|
Apocalypse-19/amazon-shoes | ---
license: mit
---
|
Aerobotics/belly-angle-selection-5K | ---
dataset_info:
features:
- name: image
dtype: image
- name: 'Unnamed: 0'
dtype: int64
- name: label
dtype: float64
- name: belly_angle
dtype: float64
- name: class_label
dtype: string
- name: annotation_task_id
dtype: float64
- name: s3_path_to_input_image
dtype: string
- name: s3_path_to_output_annotations_geojson
dtype: string
- name: fruit_annotation_id
dtype: float64
- name: confidence
dtype: float64
- name: area_px2
dtype: float64
- name: cam_capture_id
dtype: float64
- name: fruit_finding_outputs_id
dtype: float64
- name: ml_model_version_id
dtype: float64
- name: cam_capture_group_id
dtype: float64
- name: phone_model
dtype: string
- name: week_of_year
dtype: float64
- name: year
dtype: float64
- name: orchard_id
dtype: float64
- name: hectares
dtype: float64
- name: orchard_name
dtype: string
- name: crop_type_name
dtype: string
- name: crop_type_id
dtype: float64
- name: cultivar_name
dtype: string
- name: cultivar_id
dtype: float64
- name: farm_id
dtype: float64
- name: farm_name
dtype: string
- name: client_id
dtype: float64
- name: grouping
dtype: string
- name: farm_region
dtype: string
- name: valid_axis_insight
dtype: bool
splits:
- name: train
num_bytes: 37599285.072
num_examples: 4859
download_size: 39084964
dataset_size: 37599285.072
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Francesco/csgo-videogame | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': CSGO
'1': CT
'2': T
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: csgo-videogame
tags:
- rf100
---
# Dataset Card for csgo-videogame
**The original COCO dataset is stored at `dataset.tar.gz`**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/csgo-videogame
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
csgo-videogame
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
    'width': 640,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
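Since many downstream tools expect corner coordinates rather than COCO's `[x_min, y_min, width, height]` layout, a minimal conversion sketch (an illustration, not part of the card) is:

```python
# Sketch: convert a COCO-format box [x_min, y_min, width, height]
# to corner format [x_min, y_min, x_max, y_max].
def coco_to_xyxy(bbox):
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# First box from the data instance shown above:
print(coco_to_xyxy([302.0, 109.0, 73.0, 52.0]))  # [302.0, 109.0, 375.0, 161.0]
```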
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/csgo-videogame
### Citation Information
```
@misc{ csgo-videogame,
title = { csgo videogame Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/csgo-videogame } },
url = { https://universe.roboflow.com/object-detection/csgo-videogame },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_31 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 803599072.0
num_examples: 157816
download_size: 818437804
dataset_size: 803599072.0
---
# Dataset Card for "chunk_31"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
phanvancongthanh/enamine_np_standardized | ---
dataset_info:
features:
- name: smiles
dtype: string
splits:
- name: train
num_bytes: 2822585600
num_examples: 48585889
download_size: 968794571
dataset_size: 2822585600
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "enamine_np_standardized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kyueran/cond-mat | ---
dataset_info:
features:
- name: sentences
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 19418581
num_examples: 23499
- name: validation
num_bytes: 2140888
num_examples: 2612
download_size: 12693386
dataset_size: 21559469
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
cahya/instructions-ms | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 18889802.670684632
num_examples: 40115
- name: test
num_bytes: 497261.16465768346
num_examples: 1056
- name: validation
num_bytes: 497261.16465768346
num_examples: 1056
download_size: 10544795
dataset_size: 19884324.999999996
---
# Dataset Card for "instructions-ms"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Thermostatic/aritmetica_basic | ---
license: mit
---
|
johnny9210/instruction_019 | ---
license: apache-2.0
task_categories:
- question-answering
--- |
PiyushLavaniya/Alpaca_Instruct_Processed_train_ready | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 93680964.0
num_examples: 46800
- name: test
num_bytes: 10408996.0
num_examples: 5200
download_size: 32202704
dataset_size: 104089960.0
---
# Dataset Card for "Alpaca_Instruct_Processed_train_ready"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ravindrakinagi/new_abs_data | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Asap7772/Flatten-Math-Shepherd_0.8_12.0_-2.0_True | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: next_prompt
dtype: string
- name: next_response
dtype: string
- name: label
dtype: string
- name: question
dtype: string
- name: step
dtype: int64
- name: trajectory
dtype: int64
- name: mask
dtype: int64
- name: reward
dtype: float64
- name: mc_values
dtype: float64
splits:
- name: train
num_bytes: 4279469183
num_examples: 2482945
- name: test
num_bytes: 491798737
num_examples: 283159
download_size: 880086064
dataset_size: 4771267920
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Yamei/TVCG_NER | ---
dataset_info:
features:
- name: entities
sequence:
sequence: string
splits:
- name: train
num_bytes: 23659235
num_examples: 33012
download_size: 8412973
dataset_size: 23659235
---
# Dataset Card for "TVCG_NER"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |